Leal, Fátima

Last Name

Leal

First Name

Fátima

Name

Fátima Leal

Biography

Fátima Leal holds a Ph.D. in Information and Communication Technologies from the University of Vigo, Spain. She is an Assistant Professor at Universidade Portucalense in Porto, Portugal, and a researcher at REMIT (Research on Economics, Management and Information Technologies). Following a full-time postdoctoral fellowship funded by the European Commission, she continues to collaborate with the Cloud Competency Centre at the National College of Ireland in Dublin. Her research is based on crowdsourced information, covering Trust and Reputation, Big Data, Data Streams, and Recommendation Systems. Recently, she has been exploring blockchain technologies for responsible data processing. Affiliation: REMIT – Research on Economics, Management and Information Technologies; DCT – Department of Science and Technology (Departamento de Ciência e Tecnologia).

Research Projects

Organizational Units

Organizational Unit
REMIT – Research on Economics, Management and Information Technologies
A research centre whose main objective is to produce and disseminate theoretical and applied knowledge that enables a better understanding of the economic, business, territorial and technological dynamics and trends of the contemporary world and of their socioeconomic effects. REMIT adopts a multidisciplinary perspective that integrates several scientific domains: Economics and Management; Science and Technology; Tourism, Heritage and Culture. Founded in 2017, REMIT – Research on Economics, Management and Information Technologies is a research unit of Portucalense University. Based on a multidisciplinary and interdisciplinary perspective, it aims to respond to social challenges through a holistic approach involving a wide range of scientific fields such as Economics, Management, Science, Technology, Tourism, Heritage and Culture. Grounded in the production of advanced scientific knowledge, REMIT has a special focus on its application to the resolution of real issues and challenges, with two strategic orientations: the understanding of the local, national and international environment, and the development of activities oriented to professional practice, namely in the business world.

Search Results

Now showing 1 - 10 of 27
  • Publication (Open Access)
    Crowdsourced data stream mining for tourism recommendation
    2021-04 - Veloso, Bruno; Malheiro, Benedita; Burguillo, Juan C.; Leal, Fátima
    Crowdsourced data streams are continuous flows of data generated at a high rate by users, also known as the crowd. These data streams are popular and extremely valuable in several domains. This is the case of tourism, where crowdsourcing platforms rely on tourist and business inputs to provide tailored recommendations to future tourists in real time. The continuous, open and non-curated nature of crowd-originated data requires robust data stream mining techniques for online profiling, recommendation and evaluation. These techniques need not only to continuously improve profiles and learn models, but also to be transparent, overcome biases, prioritise preferences, and master huge data volumes, all in real time. This article surveys the state of the art in this field and identifies future research opportunities.
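    A minimal sketch of the incremental processing constraint described above (illustrative Python, not any specific system from the survey): item profiles are refined one event at a time, with constant memory per item and no revisiting of past data.

      # Incremental mean rating per item, updated one crowdsourced event at a time.
      profiles = {}   # item -> (number of ratings, running mean rating)

      def consume(event):
          n, mean = profiles.get(event["item"], (0, 0.0))
          n += 1
          mean += (event["rating"] - mean) / n      # online mean update
          profiles[event["item"]] = (n, mean)

      for event in [{"item": "museum", "rating": 5},
                    {"item": "museum", "rating": 3},
                    {"item": "beach", "rating": 4}]:
          consume(event)
      print(profiles)   # {'museum': (2, 4.0), 'beach': (1, 4.0)}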
  • Publication (Restricted Access)
    Towards a computational approach for the assessment of compliance of ALCOA+ Principles in pharma industry
    2022-05-25 - Durá, Marta; Sánchez-García, Ángel; Sáez, Carlos; Chis, Adriana E.; González-Vélez, Horacio; García-Gómez, Juan M.; Leal, Fátima
    The pharmaceutical industry is a data-intensive environment and a heavily regulated sector, where exhaustive audits and inspections are performed to ensure the safety of drugs. In this context, processing and evaluating the data generated on the manufacturing lines is a relevant challenge, since it requires compliance with pharma regulations. This work combines data integrity metrics and blockchain technology to evaluate the degree of compliance with the ALCOA+ principles across different levels of drug manufacturing data. We propose the DIALCOA tool, software that assesses the compliance degree for each ALCOA+ principle based on data from manufacturing batch reports and their different levels of information.
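    As an illustration only (not the DIALCOA tool itself; the principles and checks below are simplified assumptions), a compliance degree per principle can be computed as the fraction of passed boolean checks on a batch-record entry.

      from dataclasses import dataclass

      @dataclass
      class BatchEntry:
          operator_id: str   # who recorded the value (Attributable)
          timestamp: str     # when it was recorded (Contemporaneous)
          signed: bool       # electronic signature present (Attributable)
          transcribed: bool  # True if copied rather than captured first-hand (Original)
          readable: bool     # passes legibility checks (Legible)

      def compliance_degree(entry: BatchEntry) -> dict:
          """Fraction of passed checks per (simplified) ALCOA+ principle."""
          checks = {
              "Attributable":    [bool(entry.operator_id), entry.signed],
              "Legible":         [entry.readable],
              "Contemporaneous": [bool(entry.timestamp)],
              "Original":        [not entry.transcribed],
          }
          return {principle: sum(c) / len(c) for principle, c in checks.items()}

      entry = BatchEntry("op-17", "2022-05-25T10:31:00", True, False, True)
      print(compliance_degree(entry))   # {'Attributable': 1.0, 'Legible': 1.0, ...}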
  • Publication (Restricted Access)
    Explanation plug-in for stream-based collaborative filtering
    2022-05-11 - García-Méndez, Silvia; Malheiro, Benedita; Burguillo, Juan C.; Leal, Fátima
    Collaborative filtering is a widely used recommendation technique, which often relies on rating information shared by users, i.e., crowdsourced data. These filters rely on predictive algorithms, such as memory- or model-based predictors, to build direct or latent user and item profiles from crowdsourced data. To predict unknown ratings, memory-based approaches rely on the similarity between users or items, whereas model-based mechanisms explore user and item latent profiles. However, many of these filters are opaque by design, leaving users with unexplained recommendations. To overcome this drawback, this paper introduces Explug, a local model-agnostic plug-in that works alongside stream-based collaborative filters to reorder and explain recommendations. The explanations are based on incremental user Trust & Reputation profiling and co-rater relationships. Experiments performed with crowdsourced data from TripAdvisor show that Explug explains and improves the quality of stream-based collaborative filtering recommendations.
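    A minimal sketch of trust-aware reordering in the spirit of the abstract (not the Explug implementation; the blending rule and weights are assumptions): each recommendation's predicted score is blended with the trust the target user places in the item's co-raters, and the most trusted co-rater yields the explanation.

      def reorder(user, recommendations, raters_of, trust, alpha=0.7):
          """recommendations: list of (item, predicted score in [0, 1])."""
          ranked = []
          for item, score in recommendations:
              co_raters = [r for r in raters_of.get(item, []) if r != user]
              t = [trust.get((user, r), 0.0) for r in co_raters]
              trust_score = sum(t) / len(t) if t else 0.0
              ranked.append((alpha * score + (1 - alpha) * trust_score, item, co_raters))
          ranked.sort(key=lambda x: x[0], reverse=True)
          for blended, item, co_raters in ranked:
              best = max(co_raters, key=lambda r: trust.get((user, r), 0.0), default=None)
              reason = f"rated by {best}, whom you trust" if best else "no co-raters yet"
              print(f"{item}: {blended:.2f} ({reason})")

      reorder("u1",
              [("hotel_a", 0.9), ("hotel_b", 0.8)],
              {"hotel_a": ["u2"], "hotel_b": ["u3", "u4"]},
              {("u1", "u2"): 0.2, ("u1", "u3"): 0.9})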
  • Publication (Open Access)
    Blockchain for data originality in Pharma Manufacturing
    2023-07-07 - Durá, Marta; Sánchez-García, Ángel; Sáez, Carlos; García-Gómez, Juan M.; Chis, Adriana E.; González-Vélez, Horacio; Leal, Fátima
    This paper analyses the feasibility of tracking data originality for pharmaceutical manufacturing in a tamper-proof manner using a geographically distributed system. The main research question is whether it is possible to ensure the traceability of drug manufacturing through the use of smart contracts and a private blockchain network.
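    The core idea can be illustrated with a toy hash-chained log in Python (a stand-in for the paper's private blockchain and smart contracts, not its implementation): each manufacturing event is linked to the hash of the previous block, so any later alteration breaks the chain and is detected.

      import hashlib, json

      def block_hash(block):
          return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

      chain = []
      for event in [{"batch": "B-042", "step": "granulation", "operator": "op-17"},
                    {"batch": "B-042", "step": "compression", "operator": "op-22"}]:
          prev = block_hash(chain[-1]) if chain else "0" * 64
          chain.append({"prev": prev, "event": event})

      def verify(chain):
          return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

      print(verify(chain))                       # True
      chain[0]["event"]["operator"] = "op-99"    # tamper with the first event
      print(verify(chain))                       # False: the link to block 0 no longer matches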
  • Publication (Restricted Access)
    Responsible processing of crowdsourced tourism data
    2020-07-13 - Malheiro, Benedita; Veloso, Bruno; Burguillo, Juan Carlos; Leal, Fátima
    Online tourism crowdsourcing platforms, such as AirBnB, Expedia or TripAdvisor, rely on continuous data sharing by tourists and businesses to provide free or paid value-added services. When adequately processed, these data streams can be used to explain and support businesses in the early identification of trends, as well as prospective tourists in obtaining tailored recommendations, increasing confidence in the platform and further empowering end-users. However, existing platforms still do not embrace the desired accountability, responsibility and transparency (ART) design principles underlying the concept of sustainable tourism. The objective of this work is to study this problem, identify the most promising techniques which follow these principles and design a novel ART-compliant processing pipeline. To this end, this work surveys: (i) real-time data stream mining techniques for recommendation and trend identification; (ii) trust and reputation (T&R) modelling of data contributors; (iii) chain-based storage of trust models as smart contracts for traceability and authenticity; and (iv) trust- and reputation-based explanations for a transparent and satisfying user experience. The proposed pipeline redesign has implications for both digital and sustainable tourism, since it advances the current processing of tourism crowdsourcing platforms and impacts the three pillars of sustainable tourism.
  • Publication (Open Access)
    Interpretable success prediction in higher education institutions using pedagogical surveys
    2022-10-18 - Veloso, Bruno; Leal, Fátima; Moreira, Fernando; Santos-Pereira, Carla; Jesus-Silva, Natacha; Durão, Natércia
    The indicators of student success at higher education institutions are continuously analysed to increase student enrolment in multiple scientific areas. Every semester, students respond to a pedagogical survey that aims to collect their opinions of curricular units in terms of content and teaching methodologies. Using this information, we intend to anticipate success in higher-level courses and prevent dropouts. Specifically, this paper contributes an interpretable student classification method. The proposed solution relies on (i) a pedagogical survey to collect students' opinions; (ii) a statistical data analysis to validate the reliability of the survey; and (iii) machine learning algorithms to classify the success of a student. In addition, the proposed method includes an explainable mechanism to interpret the classifications and their main factors. This transparent pipeline was designed to have implications for both digital and sustainable education, impacting the three pillars of sustainability, i.e., economic, social, and environmental, where transparency is a cornerstone. The work was assessed with a dataset from a Portuguese higher education institution, contemplating multiple courses from different departments. The most promising results were achieved with Random Forest, which reached 98% in both accuracy and F-measure.
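    A hedged sketch of the classification step with scikit-learn (synthetic Likert-scale answers rather than the institutional dataset; the question names are assumptions): a Random Forest predicts success, and its feature importances give a simple view of the main factors, in the spirit of the interpretability mechanism described above.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import f1_score

      rng = np.random.default_rng(0)
      X = rng.integers(1, 6, size=(500, 4))        # 1-5 answers to four survey questions
      y = (X[:, 0] + X[:, 2] > 6).astype(int)      # toy "success" label

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

      print("F-measure:", round(f1_score(y_te, model.predict(X_te)), 2))
      for question, weight in zip(["content", "methods", "workload", "assessment"],
                                  model.feature_importances_):
          print(f"  {question}: {weight:.2f}")     # which survey items drive the prediction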
  • Publication (Open Access)
    Simulation, modelling and classification of wiki contributors: Spotting the good, the bad, and the ugly
    2022-06 - García-Méndez, Silvia; Malheiro, Benedita; Burguillo-Rial, Juan Carlos; Veloso, Bruno; Chis, Adriana E.; González-Vélez, Horacio; Leal, Fátima
    Data crowdsourcing is a data acquisition process where groups of voluntary contributors feed platforms with highly relevant data, ranging from news, comments, and media to knowledge and classifications. It typically processes user-generated data streams to provide and refine popular services such as wikis, collaborative maps, e-commerce sites, and social networks. Nevertheless, this modus operandi raises severe concerns regarding ill-intentioned data manipulation in adversarial environments. This paper presents a simulation, modelling, and classification approach to automatically identify human and non-human (bot) as well as benign and malign contributors, using data fabrication to balance classes within experimental data sets, data stream modelling to build and update contributor profiles, and, finally, autonomic data stream classification. Using WikiVoyage – a free worldwide wiki travel guide open to contribution from the general public – as a testbed, our approach significantly boosts the confidence and quality of the classifier through a class-balanced data stream comprising both real and synthetic data. Our empirical results show that the proposed method distinguishes between benign and malign bots as well as human contributors with a classification accuracy of up to 92%.
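    To make the class-balancing idea concrete, here is a simplified sketch (assumed features and noise scale, not the paper's fabrication procedure): the minority class is augmented with jittered copies of real samples, and the classifier is then updated incrementally, as on a data stream, via scikit-learn's partial_fit.

      import numpy as np
      from sklearn.linear_model import SGDClassifier

      rng = np.random.default_rng(1)
      X_min = rng.normal(0.0, 1.0, (40, 3))    # minority class, e.g. malign contributors
      X_maj = rng.normal(1.5, 1.0, (400, 3))   # majority class, e.g. benign contributors

      # Fabricate synthetic minority samples: real samples plus small Gaussian noise.
      extra = X_min[rng.integers(0, len(X_min), 360)] + rng.normal(0, 0.1, (360, 3))
      X = np.vstack([X_maj, X_min, extra])
      y = np.array([0] * 400 + [1] * (40 + 360))

      clf = SGDClassifier(random_state=1)
      for batch in rng.permutation(len(X)).reshape(-1, 40):   # consume the data as a stream
          clf.partial_fit(X[batch], y[batch], classes=[0, 1])
      print("accuracy on the balanced stream:", round(clf.score(X, y), 2))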
  • Publication (Restricted Access)
    Interpretable classification of Wiki-review streams
    2023-12-13 - García-Méndez, Silvia; Malheiro, Benedita; Burguillo-Rial, Juan Carlos; Leal, Fátima
    Wiki articles are created and maintained by a crowd of editors, producing a continuous stream of reviews. Reviews can take the form of additions, reverts, or both. This crowdsourcing model is exposed to manipulation, since neither reviews nor editors are automatically screened and purged. To protect articles against vandalism or damage, the stream of reviews can be mined to classify reviews and profile editors in real time. The goal of this work is to anticipate and explain which reviews to revert, so that editors are informed why their edits will be reverted. The proposed method employs stream-based processing, updating the profiling and classification models on each incoming event. The profiling uses side- and content-based features employing Natural Language Processing, and editor profiles are incrementally updated based on their reviews. Since the proposed method relies on self-explainable classification algorithms, it is possible to understand why a review has been classified as a revert or a non-revert. In addition, this work contributes an algorithm for generating synthetic data for class balancing, making the final classification fairer. The proposed online method was tested with a real data set from Wikivoyage, which was balanced through the aforementioned synthetic data generation. The results attained near-90% values for all evaluation metrics (accuracy, precision, recall, and F-measure).
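    A sketch of the incremental profiling step only (the field names are assumptions, and the downstream self-explainable classifier is omitted): each incoming review updates its editor's running profile, whose features can then be fed to the classifier.

      from collections import defaultdict

      profiles = defaultdict(lambda: {"reviews": 0, "reverts": 0, "chars_added": 0})

      def update_profile(review):
          p = profiles[review["editor"]]
          p["reviews"] += 1
          p["reverts"] += review["was_reverted"]
          p["chars_added"] += review["chars_added"]
          p["revert_ratio"] = p["reverts"] / p["reviews"]   # feature for the classifier
          return p

      stream = [
          {"editor": "e1", "was_reverted": 0, "chars_added": 120},
          {"editor": "e1", "was_reverted": 1, "chars_added": -300},
          {"editor": "e2", "was_reverted": 0, "chars_added": 45},
      ]
      for review in stream:
          print(review["editor"], update_profile(review))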
  • Publication (Restricted Access)
    Reliable and transparent Supply Chain supported by Blockchain
    2024-10-29 - Leal, Fátima; Moreira, Fernando
    With the increasing complexity and fragmentation of supply chains, maintaining reliability and transparency has become a significant challenge. Blockchain offers a decentralized and transparent mechanism for recording and verifying transactions. In the supply chain context, blockchain can create an immutable and secure ledger that traces a product's cycle from its origin to the end consumer. Each transaction, such as a transfer of ownership, location, or quality control, is recorded in a block, forming a chain of information that cannot be altered or tampered with. One of the key advantages of implementing blockchain is transparency, which reduces the risk of fraud, counterfeiting, and unethical practices. Furthermore, blockchain technology improves the reliability and efficiency of supply chain operations. By automating and streamlining manual processes, such as paperwork, record-keeping, and verification, blockchain reduces human errors and delays. Smart contracts, self-executing agreements coded on the blockchain, enable automatic triggers for actions like payments and order fulfillment, eliminating the middleman and reducing transaction costs. This streamlined process not only saves time and resources but also ensures accurate and timely delivery of goods, enhancing customer satisfaction. Embracing blockchain as a foundational technology in supply chain management can lead to a more secure, efficient, and sustainable global trade environment.
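    As a plain-Python analogue of the smart-contract behaviour described above (illustrative only; a real deployment would run as chain code on a blockchain): each custody transfer is appended to an append-only log, and delivery to the buyer automatically triggers the payment action.

      class ShipmentContract:
          def __init__(self, product_id, buyer, seller):
              self.product_id, self.buyer, self.seller = product_id, buyer, seller
              self.log = []          # append-only custody trail
              self.paid = False

          def transfer(self, holder, location):
              self.log.append({"holder": holder, "location": location})
              if holder == self.buyer and not self.paid:   # delivery condition met
                  self.paid = True
                  print(f"payment released: {self.buyer} -> {self.seller}")

      contract = ShipmentContract("SKU-88", buyer="retailer", seller="producer")
      contract.transfer("carrier", "distribution hub")
      contract.transfer("retailer", "store warehouse")     # triggers the payment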
  • Publication (Restricted Access)
    Trust and reputation smart contracts for explainable recommendations
    2020-05-18 - Veloso, Bruno; Malheiro, Benedita; González-Vélez, Horacio; Leal, Fátima
    Recommendation systems are usually evaluated through accuracy and classification metrics. However, when these systems are supported by crowdsourced data, such metrics are unable to estimate data authenticity, leading to potential unreliability. Consequently, it is essential to ensure data authenticity and processing transparency in large crowdsourced recommendation systems. In this work, processing transparency is achieved by explaining recommendations and data authenticity is ensured via blockchain smart contracts. The proposed method models the pairwise trust and system-wide reputation of crowd contributors; stores the contributor models as smart contracts in a private Ethereum network; and implements a recommendation and explanation engine based on the stored contributor trust and reputation smart contracts. In terms of contributions, this paper explores trust and reputation smart contracts for explainable recommendations. The experiments, which were performed with a crowdsourced data set from Expedia, showed that the proposed method provides cost-free processing transparency and data authenticity at the cost of latency.
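    A sketch of the two quantities the method models (the formulas below are illustrative assumptions, not the paper's definitions, and the smart-contract storage step is omitted): pairwise trust as rating agreement between two contributors, and system-wide reputation as the average trust the rest of the crowd places in a contributor.

      ratings = {                               # contributor -> {item: rating in 1..5}
          "a": {"h1": 5, "h2": 4},
          "b": {"h1": 5, "h2": 2},
          "c": {"h1": 1},
      }

      def trust(u, v):
          """Pairwise trust of u in v from agreement on commonly rated items."""
          common = set(ratings[u]) & set(ratings[v])
          if not common:
              return 0.0
          agreement = [1 - abs(ratings[u][i] - ratings[v][i]) / 4 for i in common]
          return sum(agreement) / len(agreement)

      def reputation(v):
          """System-wide reputation of v: mean trust received from the other contributors."""
          others = [u for u in ratings if u != v]
          return sum(trust(u, v) for u in others) / len(others)

      print(trust("a", "b"))    # 0.75
      print(reputation("a"))    # 0.375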