Authors: Akbacak, E.; Toktas, A.; Erkan, U.; Gao, S.
Date accessioned: 2024-01-22
Date available: 2024-01-22
Date of issue: 2024
ISSN: 0950-7051
DOI: https://doi.org/10.1016/j.knosys.2023.111193
Handle: https://hdl.handle.net/11492/8062
Language: en
Title: MLMQ-IR: Multi-label multi-query image retrieval based on the variance of Hamming distance
Type: Article
Volume: 283
Access rights: info:eu-repo/semantics/closedAccess
Scopus ID: 2-s2.0-85177215352
WoS ID: WOS:001165684500001
Quartile: Q1 (Scopus), Q1 (WoS)
Keywords: CNN; Deep learning; Hash code; Image retrieval; Multi-label image retrieval; Multi-query image retrieval; Variance; Hamming distance; Hash functions; Query processing; Image database; Label images; Multi-labels; Query images; Retrieval methods; Economic and social effects

Abstract: Image retrieval (IR) methods extract the images most relevant to the query images from an image database. Existing IR methods retrieve images with a low degree of similarity and rely on computationally expensive approaches. In this study, a novel Multi-Label Multi-Query IR (MLMQ-IR) method based on the variance of Hamming distance is presented for querying with multiple images that have multiple labels. MLMQ-IR uses deep learning-based hash code generation with the ResNet50 architecture. The variance measures the variation of the distances between the query and database images; minimizing it yields the trade-off images located at the center of the Pareto space. Moreover, MLMQ-IR employs a new Triple Loss Multi-Label Hashing (TLMH) scheme built on binary cross-entropy and bit-balance loss functions. MLMQ-IR is compared with recent multi-label and multi-query methods on the MIRFLICKR-25K, MS-COCO, and NUS-WIDE datasets in terms of three well-known metrics, and the methods are ranked by their success. As a result, MLMQ-IR achieves the best average mean rank of 1.86. The results show that MLMQ-IR retrieves the images most similar to the query images owing to its use of the variance, which makes the IR approach efficient and fast. © 2023 Elsevier B.V.
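
To make the ranking step concrete, the following is a minimal NumPy sketch of multi-query ranking by the variance of Hamming distances as described in the abstract. The function names, the secondary sort by mean distance, and the random hash codes are illustrative assumptions; in the paper the binary codes come from the TLMH deep hashing network, which is not reproduced here.

```python
import numpy as np

def hamming_distances(queries, database):
    """Pairwise Hamming distances between binary hash codes.

    queries:  (K, B) array of {0, 1} codes for the K query images.
    database: (N, B) array of {0, 1} codes for the N database images.
    Returns an (N, K) matrix of Hamming distances.
    """
    # Element-wise comparison counts the differing bits for every pair.
    return (database[:, None, :] != queries[None, :, :]).sum(axis=2)

def rank_by_distance_variance(queries, database, top_k=10):
    """Rank database images for a multi-query request.

    Images whose distances to the individual queries vary the least
    (smallest variance) are treated as trade-off images near the center
    of the Pareto space; the mean distance is used here only as an
    assumed secondary key to break ties among low-variance candidates.
    """
    d = hamming_distances(queries, database)   # (N, K) distance matrix
    var = d.var(axis=1)                        # spread across the queries
    mean = d.mean(axis=1)                      # overall closeness to the query set
    order = np.lexsort((mean, var))            # sort by variance first, then mean
    return order[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    db = rng.integers(0, 2, size=(1000, 64))   # 1000 database codes, 64 bits each
    q = rng.integers(0, 2, size=(3, 64))       # 3 query images (multi-query)
    print(rank_by_distance_variance(q, db, top_k=5))
```

Because the score is computed directly on precomputed binary codes, the ranking itself stays cheap, which is consistent with the abstract's claim that the variance-based approach is efficient and fast.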