Title: An emotion-aware search engine for multimedia content based on deep learning algorithms

Authors: Andrea Chiorrini; Claudia Diamantini; Alex Mircoli; Domenico Potena; Emanuele Storti

Addresses: Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy; Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy; Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy; Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy; Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy

Abstract: Nowadays, large amounts of unstructured data are available online. Such data often convey users' emotions and feelings about a variety of topics, but retrieving and selecting them from an emotional perspective is usually unfeasible with traditional search engines, which rank web content only according to its relevance to a given search keyword. For this reason, in the present work we introduce the architecture of a novel emotion-aware search engine that returns search results ranked on the basis of seven human emotions. With this system, users benefit from a more advanced semantic search that also takes emotions into account. The system uses emotion recognition algorithms based on deep learning to extract emotion vectors from texts, images and videos, and then populates an emotional index that allows users to visualise results related to given emotions. We also discuss and evaluate different deep learning models for building emotional indexes from texts, images and videos.
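The abstract describes ranking indexed multimedia items by emotion as well as keyword relevance. The following minimal sketch illustrates one plausible way such an emotional index could be queried; the seven-emotion label set, the linear relevance/emotion blend, and the `alpha` weight are all illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical seven-emotion vocabulary; the abstract does not enumerate its labels.
EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

@dataclass
class IndexedItem:
    doc_id: str
    relevance: float        # classic keyword-relevance score in [0, 1]
    emotions: list[float]   # emotion vector aligned with EMOTIONS, values in [0, 1]

def emotion_aware_rank(items, target_emotion, alpha=0.5):
    """Rank items by a blend of keyword relevance and one emotion's intensity.

    alpha weighs relevance against emotion; both the blend and the default
    value are illustrative, not the authors' scoring function.
    """
    e = EMOTIONS.index(target_emotion)
    scored = [(alpha * it.relevance + (1 - alpha) * it.emotions[e], it)
              for it in items]
    return [it for _, it in sorted(scored, key=lambda p: p[0], reverse=True)]

# Toy emotional index with two items.
items = [
    IndexedItem("d1", 0.9, [0.1, 0.0, 0.0, 0.8, 0.0, 0.1, 0.0]),
    IndexedItem("d2", 0.7, [0.7, 0.1, 0.1, 0.0, 0.1, 0.0, 0.0]),
]
ranked = emotion_aware_rank(items, "anger")
```

With the toy vectors above, "d2" outranks "d1" for the query emotion "anger" despite its lower keyword relevance, which is the kind of reordering an emotion-aware engine would produce over a purely relevance-based one.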

Keywords: emotion recognition; query answering; emotion-aware query answering; multimedia query answering; sentiment analysis; emotion analysis; emotion-aware search engine; deep learning; BERT; multimodal analysis.

DOI: 10.1504/IJCAT.2023.134757

International Journal of Computer Applications in Technology, 2023 Vol.73 No.2, pp.130 - 139

Received: 05 Aug 2022
Received in revised form: 25 Mar 2023
Accepted: 28 Mar 2023

Published online: 09 Nov 2023
