People

Latest Research Publications:
Complex societal problems require a multi-disciplinary and multi-method approach to develop models that can support the development of solutions. General morphological analysis is a qualitative method to extract information from experts through facilitation and the use of customized software. Ontologies provide semantic representation of knowledge bases together with automated reasoning capabilities. These two approaches, combined with the use of concept maps, provide an integrated approach which can be used to understand complex and ill-structured problem domains and to aid in business modelling, strategy and scenario development and finally, decision-making. The resulting models are subjective constructs reflecting the knowledge and understanding of the analysts. Subsequent synthesis of new understanding and decisions rely on the robust validation and verification of the underlying logic and assumptions of the conceptual models.
Morphological analysis and ontological constructs are applied in terms of an integrated Morphological Ontology Design Engineering (MODE) methodology, which is based on Design Science. The paper is developed around the opportunity of scoping the applied research competence required to support a nation’s progress toward energy sufficiency, and presents a complex fused model for national energy sufficiency in New Zealand. The approach can be used to address other ill-structured complex societal problems.
@inproceedings{375,
  author    = {JH Roodt and Louise Leenen and Jansen van Vuuren},
  title     = {Modelling of the Complex Societal Problem of Establishing a National Energy Sufficiency Competence},
  booktitle = {23rd International Conference on Information Fusion},
  pages     = {880-887},
  month     = {06/07-09/07},
  year      = {2020},
  isbn      = {978-0-578-64709-8},
}
The protection and management of data, and especially personal information, is becoming an issue of critical importance in both the business environment and in general society. Various institutions have justifiable reasons to gather the personal information of individuals, but they are required to comply with any legislation involving the processing of such data. Organisations thus face legal and other repercussions should personal information be breached or treated negligently. Most countries have adopted privacy and data protection laws or are in the process of enacting such laws. In South Africa, the Protection of Personal Information Act (POPIA) was formally adopted in 2013 but it is yet to be implemented. When the implementation of the Act is announced, role players (responsible parties and data subjects) affected by POPIA will have a grace period of a year to become compliant and/or understand how the Act will affect them. One example of a mandate that follows from POPIA is data breach notification. This paper presents the development of a prototype ontology on POPIA to promote transparency and education of affected data subjects and organisations, including government departments. The ontology provides a semantic representation of a knowledge base for the regulations in POPIA and how the Act affects these role players. POPIA is closely aligned with the European Union’s General Data Protection Regulation (GDPR), and the POPIA ontology is inspired by similar ontologies developed for the GDPR.
@inproceedings{374,
  author    = {Y Jafta and Louise Leenen and P Chan},
  title     = {An Ontology for the South African Protection of Personal Information Act},
  booktitle = {The 19th European Conference on Cyber Warfare and Security},
  pages     = {158-176},
  month     = {25/06-26/06},
  year      = {2020},
  publisher = {Academic Conferences and Publishing International Limited},
  address   = {UK},
  isbn      = {978-1-912764-61-7},
}
In the recent past, some Internet users questioned the reliability of online news, but not necessarily the role of search engines in programming public discourse. In 2018, South African Twitter users accused Google of peddling misinformation when Google Image searches for the phrase “squatter camps in South Africa” displayed images of white squatter camps. Many analysts blamed Google’s algorithm for displaying bias. In this article, the authors use this example in comparing the findings of six different search engines to counter this argument. Search engines that are diverse in their scope and origin are used to prove that it is not the algorithm, but rather the data, that is biased.
@article{373,
  author    = {Jansen van Vuuren and Louise Leenen},
  title     = {Proving It Is the Data That Is Biased, Not the Algorithm Through a Recent South African Online Case Study},
  journal   = {Journal of Information Warfare},
  volume    = {19},
  number    = {3},
  pages     = {118-129},
  year      = {2020},
  publisher = {Peregrine Technical Solutions},
  address   = {Virginia, USA},
  issn      = {1445-3312},
}
Cybercrime is increasing at a rate few individuals would have predicted. IBM estimated in 2016 that, in 2019, the cost of cybercrime would reach $2 trillion, a threefold increase from the 2015 estimate of $500 billion. The growth of the Internet and the rapid development of technology provide enormous economic and social benefits but at the same time provide platforms for cybercriminals to exploit. Organised crime is using more sophisticated techniques, which require highly skilled and specialised law enforcement responses. One example is the use of cryptocurrencies, which makes it easier for cybercriminals to hide their proceeds. Regulatory measures often lag behind.
In this paper, the authors give an overview of the growing threat of cybercrime with a specific focus on high levels of cybercrime in Africa. The focus then turns to the development of national cybercrime strategies and implementation. Results from literature and the authors’ analyses of two cyber indices to measure the capabilities and capacities of countries are combined to present a framework for the development of a cybercrime strategy, and in particular, a strategy customised for African countries.
@article{372,
  author    = {Jansen van Vuuren and Louise Leenen and P Pieterse},
  title     = {Development and Implementation of Cybercrime Strategies in Africa with Specific Reference to South Africa},
  journal   = {Journal of Information Warfare},
  volume    = {19},
  number    = {3},
  pages     = {83-101},
  year      = {2020},
  publisher = {Peregrine Technical Solutions},
  address   = {Virginia, USA},
  issn      = {1445-3312},
}
Cybersecurity is often incorrectly assumed to be a purely technical field; however, it has numerous multidisciplinary aspects, such as human factors and legal and governance issues. The broad scope, combined with other historical or bureaucratic factors, can pose challenges to researchers and students where appropriate methodologies do not necessarily conform to traditional disciplinary norms; prejudice against research approaches can occur as a result of ‘old school thought’. This paper aims to investigate the South African national and institutional perspectives on higher education and research, identify challenges, and propose solutions to facilitate multidisciplinary research into cybersecurity and Information Warfare (IW) in South Africa.
@article{371,
  author    = {T Ramluckan and B van Niekerk and Louise Leenen},
  title     = {Cybersecurity and Information Warfare Research in South Africa: Challenges and Proposed Solutions},
  journal   = {Journal of Information Warfare},
  volume    = {19},
  number    = {1},
  pages     = {80-95},
  year      = {2020},
  publisher = {Peregrine Technical Solutions},
  address   = {Virginia, USA},
  issn      = {1445-3312},
}

TALKS:
1) 'What role can consciousness play in the building of artificial moral agents?' (CAIR/UP Symposium 2019);
2) 'Artificial moral agency: Philosophical challenges and design considerations' (FAIR 2018).
PUBLICATIONS:
1) Mabaso, B. A. (2020b). Computationally rational agents can be moral agents. 'Ethics and Information Technology'. doi:10.1007/s10676-020-09527-1;
2) Mabaso, B. A. (2020a). Artificial Moral Agents Within an Ethos of AI4SG. 'Philosophy and Technology'. doi:10.1007/s13347-020-00400-z.

DEGREES LINKED TO THIS RESEARCH GROUP:
1) 2020 - current PhD (Philosophy): (provisional title) 'Human Dignity as Deciding Factor for Artificial Morality';
2) 2018-2019 Master of Arts (Philosophy): 'Ethics of AI: Exploring a Virtue Ethics Framework for Lethal Autonomous Weapon Systems';
3) 2017 BA (Honours) (Philosophy): 'The Possibility of Ascribing Artificial Moral Agency: The Case of Self-driving Cars'.
TALKS:
1) 'Virtue ethics as a solution to artificial moral reasoning in the context of lethal autonomous weapon systems' (Forum for Artificial Intelligence Research (FAIR 2019));
2) 'Virtue ethics as a solution to artificial moral reasoning in the context of lethal autonomous weapon systems' (Fourth Industrial Revolution: Philosophical, Ethical and Legal Dimensions Conference 2019);
3) 'The use and abuse of autonomous weapons systems' (Amnesty International (UP Chapter) 2019 symposium);
4) 'An investigation on the moral status and use of Autonomous Weapons Systems in warfare' (PSSA 2019);
5) 'Responsible belief as a moral obligation and the quest for fair and transparent machine learning' (co-presented with Emma Ruttkamp-Bloem, Digital Humanities Southern Africa (DHASA));
6) 'Virtue ethics as a solution for artificial moral reasoning in autonomous weapons' (FAIR 2018);
7) Talks at CAIR/UP Symposium 2019 and 2018.
WORKSHOP ATTENDANCE:
1) 16-17 August 2018 – African states conference on nuclear weapons and autonomous weapons;
2) 09 July 2018 – African Seminar on lethal autonomous weapons systems.

DEGREES LINKED TO THIS RESEARCH GROUP:
1) 2017-2019 PhD (Philosophy): 'A Novel Reply to the Knowledge Argument: Wiredu’s View of Quasi-physicalism as a Positive Reply to Jackson'.
TALKS:
1) 'A response to Masaka, D. (2018), "Person, Personhood and Individual Rights in Menkiti's African Communitarian Thinking", Theoria 157, Vol. 65, No. 4' (APTA summer school, University of Fort Hare, 1-4 December 2018);
2) 'Re-visiting the Chinese Room: A Response to Boden’s critique of Searle’s positive claim' (Philosophy postgraduate conference PPA 2018 - University of Pretoria);
3) 'A reflection on the knowledge problem for reductive physicalism and the impact of epistemic injustice on the quality of knowledge representation and reasoning in the context of the mind-body problem' (Humanities postgraduate conference 2019 - University of Pretoria);
4) 'A Novel response to the knowledge argument in support of non-reductive physicalism' (PSSA 2020 - University of KwaZulu-Natal).

He is always looking for good postgraduate students to join the KRR group. If you have an interest in logic-based Artificial Intelligence, please contact him. Details about postgraduate admission in Computer Science at UCT can be found here: http://www.sit.uct.ac.za/sit/postgrad/overview
Latest Research Publications:
Belief revision and belief update are approaches to represent and reason with knowledge in artificial intelligence. Previous empirical studies have shown that human reasoning is consistent with non-monotonic logic and postulates of defeasible reasoning, belief revision and belief update. We extended previous work, which tested natural language translations of the postulates of defeasible reasoning, belief revision and belief update with human reasoners via surveys, in three respects.
Firstly, we only tested postulates of belief revision and belief update, taking the position that belief change aligns more closely with human reasoning than non-monotonic defeasible reasoning. Secondly, we decomposed the postulates of revision and update into material implication statements of the form “If x is the case, then y is the case”, each containing a premise and a conclusion, and then translated the premises and conclusions into natural language. Thirdly, we asked human participants to judge each component of each postulate for plausibility. In our analysis, we measured the strength of the association between the premises and the conclusion of each postulate, and used possibility theory to determine whether the postulates hold for our participants in general. Our results showed that our participants’ reasoning is consistent with the postulates of belief revision and belief update when the premises and conclusion of a postulate are judged separately.
@inproceedings{427,
  author    = {Clayton Baker and Tommie Meyer},
  title     = {Belief Change in Human Reasoning: An Empirical Investigation on MTurk},
  booktitle = {Second Southern African Conference for AI Research (SACAIR 2021)},
  pages     = {218-234},
  month     = {06/12/2021-10/12/2021},
  year      = {2022},
  publisher = {SACAIR 2021 Organising Committee},
  address   = {Online},
  isbn      = {978-0-620-94410-6},
  url       = {https://2021.sacair.org.za/proceedings/},
}
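The decomposition described above can be illustrated with a toy check. This is a sketch only: the function name, scores and threshold are invented, and the study's actual possibility-theoretic analysis is more involved; min() here merely plays the role of a possibility-style conjunction of premise judgements.

```python
# Illustrative sketch (not the study's analysis code): a postulate
# "if premises then conclusion" is decomposed into components, each
# judged for plausibility on [0, 1].

def postulate_holds(premise_scores, conclusion_score, threshold=0.5):
    """Take the postulate to hold when either the premises are jointly
    implausible, or the conclusion is at least as plausible as the
    weakest premise (min acts as the possibility-style conjunction)."""
    joint_premise = min(premise_scores)
    return joint_premise < threshold or conclusion_score >= joint_premise

# Participants judge both premises of a revision postulate plausible
# and the conclusion similarly plausible: the postulate is supported.
print(postulate_holds([0.8, 0.7], 0.75))   # True
```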
Explanation services are a crucial aspect of symbolic reasoning systems, but they have not been explored in detail for defeasible formalisms such as KLM. We evaluate prior work on the topic with a focus on KLM propositional logic and find that a form of defeasible explanation initially described for Rational Closure, which we term weak justification, can be adapted to Relevant and Lexicographic Closure, as well as described in terms of intuitive properties derived from the KLM postulates. We also consider how a more general definition of defeasible explanation known as strong explanation applies to KLM, and propose an algorithm that enumerates these justifications for Rational Closure.
@incollection{426,
  author    = {Lloyd Everett and Emily Morris and Tommie Meyer},
  title     = {Explanation for KLM-Style Defeasible Reasoning},
  booktitle = {Artificial Intelligence Research (SACAIR 2021)},
  volume    = {1551},
  year      = {2022},
  publisher = {Springer},
  address   = {Cham},
  isbn      = {978-3-030-95069-9},
  doi       = {10.1007/978-3-030-95070-5_13},
  url       = {https://link.springer.com/book/10.1007/978-3-030-95070-5},
}
We extend the expressivity of classical conditional reasoning by introducing context as a new parameter. The enriched conditional logic generalises the defeasible conditional setting in the style of Kraus, Lehmann, and Magidor, and allows for a refined semantics that is able to distinguish, for example, between expectations and counterfactuals. In this paper we introduce the language for the enriched logic and define an appropriate semantic framework for it. We analyse which properties generally associated with conditional reasoning are still satisfied by the new semantic framework, provide a suitable representation result, and define an entailment relation based on Lehmann and Magidor’s generally-accepted notion of Rational Closure.
@inproceedings{430,
  author    = {Giovanni Casini and Tommie Meyer and Ivan Varzinczak},
  title     = {Contextual Conditional Reasoning},
  booktitle = {35th AAAI Conference on Artificial Intelligence},
  pages     = {6254-6261},
  month     = {02/02/2021-09/02/2021},
  year      = {2021},
  publisher = {AAAI Press},
  address   = {Online},
}
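For reference, the "properties generally associated with conditional reasoning" mentioned above include KLM rules such as And and Cautious Monotonicity, in their standard formulations (the contextual conditionals of the paper generalise these; writing |~ for the defeasible conditional):

```latex
% Two standard KLM properties (Kraus, Lehmann and Magidor):
\[
(\mathrm{And})\ \frac{\alpha \mathrel{|\!\sim} \beta \qquad \alpha \mathrel{|\!\sim} \gamma}
                     {\alpha \mathrel{|\!\sim} \beta \land \gamma}
\qquad\qquad
(\mathrm{CM})\ \frac{\alpha \mathrel{|\!\sim} \beta \qquad \alpha \mathrel{|\!\sim} \gamma}
                    {\alpha \land \beta \mathrel{|\!\sim} \gamma}
\]
```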
We extend the KLM approach to defeasible reasoning to be applicable to a restricted version of first-order logic. We describe defeasibility for this logic using a set of rationality postulates, provide an appropriate semantics for it, and present a representation result that characterises the semantic description of defeasibility in terms of the rationality postulates. Based on this theoretical core, we then propose a version of defeasible entailment that is inspired by Rational Closure as it is defined for defeasible propositional logic and defeasible description logics. We show that this form of defeasible entailment is rational in the sense that it adheres to our rationality postulates. The work in this paper is the first step towards our ultimate goal of introducing KLM-style defeasible reasoning into the family of Datalog+/- ontology languages.
@inproceedings{429,
  author    = {Giovanni Casini and Tommie Meyer and Guy Paterson-Jones},
  title     = {KLM-Style Defeasibility for Restricted First-Order Logic},
  booktitle = {19th International Workshop on Non-Monotonic Reasoning},
  pages     = {184-193},
  month     = {03/11/2021-05/11/2021},
  year      = {2021},
  address   = {Online},
  url       = {https://drive.google.com/open?id=1WSIl3TOrXBhaWhckWN4NLXoD9AVFKp5R},
}
The past 25 years have seen many attempts to introduce defeasible-reasoning capabilities into a description logic setting. Many, if not most, of these attempts are based on preferential extensions of description logics, with a significant number of these, in turn, following the so-called KLM approach to defeasible reasoning initially advocated for propositional logic by Kraus, Lehmann, and Magidor. Each of these attempts has its own aim of investigating particular constructions and variants of the (KLM-style) preferential approach. Here our aim is to provide a comprehensive study of the formal foundations of preferential defeasible reasoning for description logics in the KLM tradition. We start by investigating a notion of defeasible subsumption in the spirit of defeasible conditionals as studied by Kraus, Lehmann, and Magidor in the propositional case. In particular, we consider a natural and intuitive semantics for defeasible subsumption, and we investigate KLM-style syntactic properties for both preferential and rational subsumption. Our contribution includes two representation results linking our semantic constructions to the set of preferential and rational properties considered. Besides showing that our semantics is appropriate, these results pave the way for more effective decision procedures for defeasible reasoning in description logics. Indeed, we also analyse the problem of non-monotonic reasoning in description logics at the level of entailment and present an algorithm for the computation of rational closure of a defeasible knowledge base. Importantly, our algorithm relies completely on classical entailment and shows that the computational complexity of reasoning over defeasible knowledge bases is no worse than that of reasoning in the underlying classical DL ALC.
@article{433,
  author    = {Katarina Britz and Giovanni Casini and Tommie Meyer and Kody Moodley and Uli Sattler and Ivan Varzinczak},
  title     = {Principles of KLM-style Defeasible Description Logics},
  journal   = {ACM Transactions on Computational Logic},
  volume    = {22},
  number    = {1},
  pages     = {1-46},
  year      = {2020},
  publisher = {ACM},
  doi       = {10.1145/3420258},
}
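The rational closure algorithm referred to above is defined for the description logic ALC; as an illustration of the underlying idea only, here is a minimal propositional sketch of the base-rank (exceptionality) construction on which rational closure is built. The representation and names are invented for this example, and entailment is checked by brute force over truth tables.

```python
# Minimal sketch of the base-rank construction behind Rational Closure
# (propositional case). Defeasible statements "a |~ b" are (a, b) pairs;
# formulas are atoms (strings) or nested tuples over a tiny language.
from itertools import product

def holds(f, v):
    """Evaluate formula f under valuation v (dict atom -> bool)."""
    if isinstance(f, str):
        return v[f]
    op, *args = f
    if op == 'not':
        return not holds(args[0], v)
    if op == 'and':
        return holds(args[0], v) and holds(args[1], v)
    if op == 'or':
        return holds(args[0], v) or holds(args[1], v)
    if op == 'implies':
        return (not holds(args[0], v)) or holds(args[1], v)

def entails(kb, f, atoms):
    """Classical entailment by exhaustive enumeration of valuations."""
    for vals in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, vals))
        if all(holds(g, v) for g in kb) and not holds(f, v):
            return False
    return True

def base_rank(defeasibles, atoms):
    """Partition a |~ b statements by exceptionality: a statement is
    exceptional when the materialisation of the current set classically
    entails the negation of its antecedent."""
    ranks, current = [], list(defeasibles)
    while current:
        mat = [('implies', a, b) for a, b in current]
        exceptional = [(a, b) for a, b in current
                       if entails(mat, ('not', a), atoms)]
        if len(exceptional) == len(current):   # fully exceptional remainder
            ranks.append(current)
            return ranks
        ranks.append([s for s in current if s not in exceptional])
        current = exceptional
    return ranks

# Classic example: birds fly, penguins are birds, penguins don't fly.
kb = [('b', 'f'), ('p', 'b'), ('p', ('not', 'f'))]
ranks = base_rank(kb, ['b', 'f', 'p'])
# "Birds fly" is most normal (rank 0); the penguin statements are
# exceptional and land in rank 1.
```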

Latest Research Publications:
Building computational models of agents in dynamic, partially observable and stochastic environments is challenging. We propose a cognitive computational model of sugarcane growers’ daily decision-making to examine sugarcane supply chain complexities. Growers make decisions based on uncertain weather forecasts; cane dryness; unforeseen emergencies; and the mill’s unexpected call for delivery of a different amount of cane. The Belief-Desire-Intention (BDI) architecture has been used to model cognitive agents in many domains, including agriculture. However, typical implementations of this architecture have represented beliefs symbolically, so uncertain beliefs are usually not catered for. Here we show that a BDI architecture, enhanced with a dynamic decision network (DDN), suitably models sugarcane grower agents’ repeated daily decisions. Using two complex scenarios, we demonstrate that the agent selects the appropriate intention, and suggests how the grower should act adaptively and proactively to achieve his goals. In addition, we provide a mapping for using a DDN in a BDI architecture. This architecture can be used for modelling sugarcane grower agents in an agent-based simulation. The mapping of the DDN’s use in the BDI architecture enables this work to be applied to other domains for modelling agents’ repeated decisions in partially observable, stochastic and dynamic environments.
@article{488,
  author  = {C. Sue Price and Deshen Moodley and Anban Pillay and Gavin Rens},
  title   = {An adaptive probabilistic agent architecture for modelling sugarcane growers’ decision-making},
  journal = {South African Computer Journal},
  volume  = {34},
  number  = {1},
  pages   = {152-191},
  year    = {2022},
  url     = {https://sacj.cs.uct.ac.za/index.php/sacj/article/view/857},
  doi     = {10.18489/sacj.v34i1.857},
}
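The role a dynamic decision network plays in such an architecture can be suggested with a toy sketch. Everything here is invented for illustration (the sensor model, utility table and function names are not from the paper, whose DDN is far richer): a probabilistic belief over cane dryness is updated by Bayes' rule from a noisy observation, and the intention with the highest expected utility is selected.

```python
# Illustrative BDI-with-uncertain-beliefs sketch (assumed numbers).

def update_belief(prior_dry, sensor):
    """Bayes update of P(dry) given a noisy 'looks dry' reading.
    Assumed sensor model: P(looks_dry | dry)=0.8, P(looks_dry | wet)=0.3."""
    p_obs_dry, p_obs_wet = (0.8, 0.3) if sensor else (0.2, 0.7)
    num = p_obs_dry * prior_dry
    return num / (num + p_obs_wet * (1 - prior_dry))

def choose_intention(p_dry, utility):
    """Pick the intention with the highest expected utility,
    where utility[action] = (payoff if dry, payoff if wet)."""
    return max(utility, key=lambda act:
               p_dry * utility[act][0] + (1 - p_dry) * utility[act][1])

utility = {'harvest': (10, -5), 'wait': (2, 4)}   # invented payoffs
belief = update_belief(0.5, sensor=True)           # P(dry) rises to ~0.73
intention = choose_intention(belief, utility)      # 'harvest'
```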
@inbook{478, author = {Tezira Wanyana, Mbithe Nzomo, C. Sue Price, Deshen Moodley}, title = {Combining Machine Learning and Bayesian Networks for ECG Interpretation and Explanation}, abstract = {We explore how machine learning (ML) and Bayesian networks (BNs) can be combined in a personal health agent (PHA) for the detection and interpretation of electrocardiogram (ECG) characteristics. We propose a PHA that uses ECG data from wearables to monitor heart activity, and interprets and explains the observed readings. We focus on atrial fibrillation (AF), the commonest type of arrhythmia. The absence of a P-wave in an ECG is the hallmark indication of AF. Four ML models are trained to classify an ECG signal based on the presence or absence of the P-wave: multilayer perceptron (MLP), logistic regression, support vector machine, and random forest. The MLP is the best performing model with an accuracy of 89.61% and an F1 score of 88.68%. A BN representing AF risk factors is developed based on expert knowledge from the literature and evaluated using Pitchforth and Mengersen’s validation framework. The P-wave presence or absence as determined by the ML model is input into the BN. The PHA is evaluated using sample use cases to illustrate how the BN can explain the occurrence of AF using diagnostic reasoning. This gives the most likely AF risk factors for the individual.}, year = {2022}, journal = {Proceedings of the 8th International Conference on Information and Communication Technologies for Ageing Well and e-Health - ICT4AWE}, pages = {81-92}, publisher = {SciTePress}, address = {INSTICC}, isbn = {978-989-758-566-1}, doi = {https://doi.org/10.5220/0011046100003188}, }
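The ML-to-BN hand-off this abstract describes can be illustrated with a toy three-node chain (hypertension → AF → P-wave absent) and inference by enumeration: the classifier's "P-wave absent" verdict enters the network as evidence, and diagnostic reasoning yields posteriors over AF and its risk factor. The structure and every number below are invented; the paper's expert-elicited BN covers many more risk factors and was validated separately.

```python
import itertools

# Toy Bayesian network H -> AF -> NoP (P-wave absent), invented CPTs.
P_H = {True: 0.3, False: 0.7}             # prior on hypertension
P_AF_GIVEN_H = {True: 0.3, False: 0.05}   # P(AF | hypertension)
P_NOP_GIVEN_AF = {True: 0.9, False: 0.1}  # P(no P-wave | AF)

def joint(h, af, nop):
    """Full joint probability of one world (h, af, nop)."""
    p = P_H[h]
    p *= P_AF_GIVEN_H[h] if af else 1 - P_AF_GIVEN_H[h]
    p *= P_NOP_GIVEN_AF[af] if nop else 1 - P_NOP_GIVEN_AF[af]
    return p

def posterior(query_var, evidence):
    """Inference by enumeration: P(query_var = True | evidence)."""
    num = den = 0.0
    for h, af, nop in itertools.product([True, False], repeat=3):
        world = {"H": h, "AF": af, "NoP": nop}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(h, af, nop)
        den += p
        if world[query_var]:
            num += p
    return num / den

# The ML classifier reports "P-wave absent"; the BN reasons diagnostically.
p_af = posterior("AF", {"NoP": True})
p_h = posterior("H", {"NoP": True})
```

With these invented numbers the evidence raises P(AF) from its prior of 0.125 to 0.5625, which is the diagnostic lift the PHA's explanation step relies on.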
@article{443, author = {Kialan Pillay, Deshen Moodley}, title = {Exploring Graph Neural Networks for Stock Market Prediction on the JSE}, abstract = {Stock markets are dynamic systems that exhibit complex intra-share and inter-share temporal dependencies. Spatial-temporal graph neural networks (ST-GNN) are emerging DNN architectures that have yielded high performance for flow prediction in dynamic systems with complex spatial and temporal dependencies such as city traffic networks. In this research, we apply three state-of-the-art ST-GNN architectures, i.e. Graph WaveNet, MTGNN and StemGNN, to predict the closing price of shares listed on the Johannesburg Stock Exchange (JSE) and attempt to capture complex inter-share dependencies. The results show that ST-GNN architectures, specifically Graph WaveNet, produce superior performance relative to an LSTM and are potentially capable of capturing complex intra-share and inter-share temporal dependencies in the JSE. We found that Graph WaveNet outperforms the other approaches over short-term and medium-term horizons. This work is one of the first studies to apply these ST-GNNs to share price prediction.}, year = {2022}, journal = {Communications in Computer and Information Science}, volume = {1551}, pages = {95-110}, publisher = {Springer}, address = {Cham}, isbn = {978-3-030-95070-5}, url = {https://link.springer.com/chapter/10.1007/978-3-030-95070-5_7}, doi = {10.1007/978-3-030-95070-5_7}, }
@inproceedings{442, author = {Rachel Drake, Deshen Moodley}, title = {INVEST: Ontology Driven Bayesian Networks for Investment Decision Making on the JSE}, abstract = {This research proposes an architecture and prototype implementation of a knowledge-based system for automating share evaluation and investment decision making on the Johannesburg Stock Exchange (JSE). The knowledge acquired from an analysis of the investment domain for a value investing approach is represented in an ontology. A Bayesian network, developed using the ontology, is used to capture the complex causal relations between different factors that influence the quality and value of individual shares. The system was found to adequately represent the decision-making process of investment professionals and provided superior returns to selected benchmark JSE indices from 2012 to 2018.}, year = {2022}, journal = {Second Southern African Conference for AI Research (SACAIR 2022)}, pages = {252-273}, month = {06/12/2021-10/12/2021}, address = {Online}, isbn = {978-0-620-94410-6}, url = {https://protect-za.mimecast.com/s/OFYSCpgo02fL1l9gtDHUkY}, }
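The kind of pipeline this abstract describes, a probabilistic share-quality node whose parents come from domain factors and whose posterior feeds a decision rule, can be caricatured with a naive-Bayes-style fragment. The indicator names, conditional probabilities, and decision threshold below are all invented stand-ins; the paper's BN is derived from an investment ontology and captures richer causal structure.

```python
# Invented toy: a "share quality" node with indicator parents, then a
# threshold decision. A crude stand-in for an ontology-derived BN.
P_GOOD = 0.5                                # prior P(quality = good)
P_OBS = {                                   # P(indicator present | quality)
    "high_roe": {"good": 0.8, "poor": 0.3},
    "low_debt": {"good": 0.7, "poor": 0.4},
}

def p_good(observed):
    """Posterior P(quality = good) given indicators observed as present."""
    num_good = P_GOOD
    num_poor = 1 - P_GOOD
    for ind in observed:
        num_good *= P_OBS[ind]["good"]
        num_poor *= P_OBS[ind]["poor"]
    return num_good / (num_good + num_poor)

def decide(observed, threshold=0.7):
    """Simple decision layer on top of the quality posterior."""
    return "invest" if p_good(observed) >= threshold else "hold"
```

In the paper the node structure and states come from the ontology rather than being hand-coded, and the decision logic mirrors a value-investing process; this fragment only shows the shape of posterior-then-threshold reasoning.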
@inbook{425, author = {Tezira Wanyana, Deshen Moodley}, title = {An Agent Architecture for Knowledge Discovery and Evolution}, abstract = {The abductive theory of method (ATOM) was recently proposed to describe the process that scientists use for knowledge discovery. In this paper we propose an agent architecture for knowledge discovery and evolution (KDE) based on ATOM. The agent incorporates a combination of ontologies, rules and Bayesian networks for representing different aspects of its internal knowledge. The agent uses an external AI service to detect unexpected situations from incoming observations. It then uses rules to analyse the current situation and a Bayesian network for finding plausible explanations for unexpected situations. The architecture is evaluated and analysed on a use case application for monitoring daily household electricity consumption patterns.}, year = {2021}, journal = {KI 2021: Advances in Artificial Intelligence}, edition = {volume 12873}, pages = {241-256}, publisher = {Springer International Publishing}, address = {Cham}, isbn = {978-3-030-87626-5}, doi = {https://doi.org/10.1007/978-3-030-87626-5_18}, }
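The detect-analyse-explain cycle this abstract describes can be sketched end to end in a few lines: an anomaly detector stands in for the external AI service, a rule classifies the situation, and a scored explanation table stands in for the Bayesian network's posteriors. The consumption history, threshold, rules, and explanation scores are all invented for illustration and are not the paper's model.

```python
import statistics

# Invented daily household consumption history (kWh).
HISTORY = [10.1, 9.8, 10.4, 10.0, 9.9, 10.2, 10.1]

def is_unexpected(reading, history, k=3.0):
    """Stand-in for the external AI service: flag readings more than
    k standard deviations from the historical mean."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return abs(reading - mu) > k * sigma

def analyse(reading, history):
    """Rule layer: classify the kind of unexpected situation."""
    return "spike" if reading > statistics.mean(history) else "drop"

# Invented plausibility scores standing in for BN posteriors.
EXPLANATIONS = {
    "spike": {"guests_visiting": 0.6, "faulty_appliance": 0.3, "meter_error": 0.1},
    "drop":  {"household_away": 0.7, "power_outage": 0.2, "meter_error": 0.1},
}

def explain(reading, history):
    """Full KDE-style cycle: detect, then analyse, then explain."""
    if not is_unexpected(reading, history):
        return None   # nothing unexpected, nothing to explain
    kind = analyse(reading, history)
    return max(EXPLANATIONS[kind], key=EXPLANATIONS[kind].get)
```

In the architecture itself the ontology and rules carry the situation analysis and a genuine BN produces the plausible explanations; this sketch only shows how the three stages hand off to one another.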