Connectionist models for sentence-based text extracts
Authors: Triantafyllou, Ioannis; Demiros, Iason; Antonopoulos, Vassilios; Georgantopoulos, Byron; Piperidis, Stelios
Publisher: IEEE
Issue Date: 1-Jan-2001
Conference: IEEE International Conference on Systems, Man and Cybernetics 
Book: IEEE International Conference on Systems, Man and Cybernetics. e-Systems and e-Man for Cybernetics in Cyberspace 
Volume: 4
Abstract: 
This paper addresses the problem of creating a summary by extracting a set of sentences that are likely to represent the content of a document. A small-scale experiment is conducted, leading to the compilation of an evaluation corpus for the Greek language. Two models of sentence extraction are then described, along the lines of shallow linguistic analysis, feature combination and machine learning. Both models are based on term extraction and statistical filtering. After extracting the individual features of the text, we apply them to two neural networks that classify each sentence according to its feature vector, the term weight being the feature with the highest discriminant capacity. A three-layer feedforward network trained with the backpropagation algorithm and a competitive-learning self-organizing map characterized by the formation of a topographic map, both trained on a small manually annotated corpus of summaries, perform the sentence extraction task. Both methods could be used for rapid, lightweight, information-retrieval-oriented summarization.
ISBN: 0-7803-7087-2
ISSN: 1062-922X
DOI: 10.1109/ICSMC.2001.972964
URI: https://uniwacris.uniwa.gr/handle/3000/458
Type: Conference Paper
Department: Department of Archival, Library and Information Studies 
School: School of Administrative, Economics and Social Sciences 
Affiliation: University of West Attica (UNIWA) 
Appears in Collections: Book Chapter
