Gregor Koporec (Author), Janez Perš (Author)

Abstract

Despite their powerful discriminative abilities, Convolutional Neural Networks (CNNs) lack the properties of generative models, which leads to decreased performance in environments where objects are poorly visible. Solving this problem by adding more training samples quickly leads to a combinatorial explosion, so the underlying architecture has to be changed instead. This work proposes a Human-Centered Deep Compositional model (HCDC) that combines the low-level visual discrimination of a CNN with the high-level reasoning of a Hierarchical Compositional model (HCM). Defined as a transparent model, it can be optimized for real-world environments by adding compactly encoded domain knowledge from human studies and physical laws. The new FridgeNetv2 dataset and a mixture of publicly available datasets are used as a benchmark. The experimental results show that the proposed model is explainable, has higher discriminative and generative power, and handles occlusion better than the current state-of-the-art Mask R-CNN in instance segmentation tasks.

Keywords

computer vision; deep learning; convolutional neural networks; hierarchical compositional model; occlusion; discriminability; generalizability; interpretability; domain knowledge

Data

Language: English
Year of publishing: 2023
Typology: 1.01 - Original Scientific Article
Organization: UL FE - Faculty of Electrical Engineering
UDC: 004
COBISS: 142438403
ISSN: 0031-3203

Other data

Secondary language: Slovenian
Secondary keywords: computer vision; deep learning; convolutional neural networks; hierarchical compositional model; occlusion; interpretability; domain knowledge
Type (COBISS): Article
Pages: pp. 1-14
Issue: Vol. 138, article no. 109397
Chronology: 2023
DOI: 10.1016/j.patcog.2023.109397
ID: 19861989