

Evaluating Systems for Multilingual and Multimodal Information Access

E-Book, PDF (PDF watermark)
1002 pages
English
Springer Berlin Heidelberg, published 29.09.2009 (2009 edition)
Available formats
E-Book, PDF (PDF watermark)
EUR 149.79
Book on Demand, softcover/paperback
EUR 160.49

Product

Blurb
The ninth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2008. There were seven main evaluation tracks in CLEF 2008 plus two pilot tasks. The aim, as usual, was to test the performance of a wide range of multilingual information access (MLIA) systems or system components. This year, 100 groups, mainly but not only from academia, participated in the campaign. Most of the groups were from Europe, but there was also a good contingent from North America and Asia plus a few participants from South America and Africa. Full details regarding the design of the tracks, the methodologies used for evaluation, and the results obtained by the participants can be found in the different sections of these proceedings. The results of the CLEF 2008 campaign were presented at a two-and-a-half-day workshop held in Aarhus, Denmark, September 17-19, and attended by 150 researchers and system developers. The annual workshop, held in conjunction with the European Conference on Digital Libraries, plays an important role by providing the opportunity for all the groups that have participated in the evaluation campaign to get together, compare approaches, and exchange ideas. The schedule of the workshop was divided between plenary track overviews and parallel, poster, and breakout sessions presenting this year's experiments and discussing ideas for the future. There were several invited talks.
Details
Further ISBN/GTIN: 9783642044472
Product type: E-Book
Binding: E-Book
Format: PDF
Format note: 1 - PDF Watermark
Format: E107
Year of publication: 2009
Publication date: 29.09.2009
Edition: 2009
Series no.: 5706
Pages: 1002
Language: English
Illustrations: XXIV, 1002 p.
Article no.: 8818928
Categories
Genre: 9200

Contents/Reviews

Table of Contents
What Happened in CLEF 2008
I: Multilingual Textual Document Retrieval (Ad Hoc)
TEL@CLEF
Persian@CLEF
Robust-WSD
Ad Hoc Mixed: TEL and Persian
II: Mono- and Cross-Language Scientific Data Retrieval (Domain-Specific)
III: Interactive Cross-Language Retrieval (iCLEF)
IV: Multiple Language Question Answering (QA@CLEF)
Mono and Bilingual QA
Answer Validation Exercise (AVE)
Question Answering on Script Transcription (QAST)
V: Cross-Language Retrieval in Image Collections (ImageCLEF)
ImageCLEFphoto
ImageCLEFmed
ImageCLEFWiki
VI: Multilingual Web Track (WebCLEF)
VII: Cross-Language Geographical Retrieval (GeoCLEF)
VIII: Cross-Language Video Retrieval (VideoCLEF)
IX: Multilingual Information Filtering (INFILE@CLEF)
X: Morpho Challenge at CLEF 2008

Author