This section is only a summary of the more complete SLT Evaluation Plan document, which can be found here.
TC-STAR Evaluation Run #2 for SLT will take place from November 18, 2005 to March 15, 2006. The complete schedule can be seen here; the important dates for SLT are outlined below:
Before the evaluation run itself, participants have access to training and development data consisting of parallel texts and transcriptions (see the training data and development data sections below).
SLT evaluation will be run in three translation directions: English to Spanish, Spanish to English, and Mandarin Chinese to English.
For each translation direction, three kinds of text data are used as input:
An example of each of the three input kinds is shown below:
Input | Example |
---|---|
Text | I am starting to know what Frank Sinatra must have felt like, |
Verbatim | I'm I'm I'm starting to know what Frank Sinatra must have felt like |
ASR output | and i'm times and starting to know what frank sinatra must have felt like |
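The three conditions differ mainly in casing and punctuation: verbatim transcripts keep disfluencies but drop final punctuation, and ASR output is additionally lowercased and punctuation-free (and may contain recognition errors). As an illustration only, a rough sketch of turning written text into ASR-like form (the helper name is ours, not part of the evaluation tools):

```python
import re

def asr_like(text: str) -> str:
    """Illustrative sketch: lowercase a sentence and strip punctuation,
    approximating the case- and punctuation-free ASR output condition.
    Real ASR output also contains recognition errors, which this cannot model."""
    text = text.lower()
    text = re.sub(r"[^\w\s']", " ", text)  # drop punctuation, keep apostrophes
    return " ".join(text.split())          # collapse extra whitespace

print(asr_like("I am starting to know what Frank Sinatra must have felt like,"))
# i am starting to know what frank sinatra must have felt like
```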
English to Spanish and Spanish to English are run on transcriptions of recordings from the European Parliament Plenary Sessions (EPPS), while Chinese to English is run on transcriptions of Voice of America recordings.
For the participants, a submission guideline is available.
Direction | Input | Participants |
---|---|---|
Zh-->En (VoA) | Single-best ASR | IBM?, IRST, RWTH, UKA |
 | Verbatim | IBM?, IRST, RWTH, UKA |
Es-->En (EPPS + PARL) | Text | IBM, IRST, RWTH, UKA, UPC |
 | Single-best ASR | IBM, IRST, LIMSI, RWTH, UKA, UPC |
 | Verbatim | IBM, IRST, LIMSI, RWTH, UKA, UPC |
En-->Es (EPPS) | Text | IBM, IRST, RWTH, UKA, UPC |
 | Single-best ASR | IBM, IRST, RWTH, UKA, UPC |
 | Verbatim | IBM, IRST, RWTH, UKA, UPC |
There is no text condition for Mandarin.
IBM: International Business Machines, Germany
IRST: Il Centro per la Ricerca Scientifica e Tecnologica, Italy
LIMSI: Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur, France
RWTH: Rheinisch-Westfaelische Technische Hochschule, Germany
UKA: Universitaet Karlsruhe, Germany
UPC: Universitat Politècnica de Catalunya, Spain
External Participants
Direction | Input | Participants |
---|---|---|
Zh-->En (VoA) | Single-best ASR | ICT, NLPR |
 | Verbatim | ICT, NRC, NLPR |
Es-->En (EPPS + PARL) | Text | SYSTRAN, UED, UPV, UW |
 | Single-best ASR | UED |
 | Verbatim | UED, UW |
En-->Es (EPPS) | Text | DFKI, SYSTRAN, UED, UPV, UW |
 | Single-best ASR | DFKI, UED |
 | Verbatim | DFKI, UED, UW |
DFKI: Deutsches Forschungszentrum für Künstliche Intelligenz, Germany
ICT: Institute of Computing Technology, China
NLPR: National Laboratory of Pattern Recognition, China
NRC: National Research Council, Canada
SYSTRAN: System Language Translation Technologies
UED: University of Edinburgh
UPV: Universitat Politècnica de Valencia, Spain
UW: University of Washington, United States
Johns Hopkins University and Google have not yet committed.
The training data are the same as for Run #1.
Direction | Description | Reference | Amount | IPR-owner | IPR-distrib | Comment |
---|---|---|---|---|---|---|
Zh->En | FBIS Multilanguage Texts | LDC2003E14 | | LDC | research | LDC membership 03 required |
 | UN Chinese English Parallel Text Version 2 | LDC2004E12 | | LDC | research | LDC membership 04 required |
 | Hong Kong Parallel Text | LDC2004T08 | | LDC | research | LDC membership 04 required |
 | English Translation of Chinese Treebank | LDC2002E17 | | LDC | research | LDC membership 02 required |
 | Xinhua Chinese-English Parallel News Text Version 1.0 beta 2 | LDC2002E18 | | LDC | research | LDC membership 02 required |
 | Chinese English Translation Lexicon version 3.0 | LDC2002L27 | | LDC | research | LDC membership 02 required |
 | Chinese-English Name Entity Lists version 1.0 beta | LDC2003E01 | | LDC | research | LDC membership 03 required |
 | Chinese English News Magazine Parallel Text | LDC2005E47 | | LDC | research | LDC membership 05 required |
 | Multiple-Translation Chinese (MTC) Corpus | LDC2002T01 | | LDC | research | LDC membership 02 required |
 | Multiple Translation Chinese (MTC) Part 2 | LDC2003T17 | | LDC | research | LDC membership 03 required |
 | Multiple Translation Chinese (MTC) Part 3 | LDC2004T07 | | LDC | research | LDC membership 04 required |
 | Chinese News Translation Text Part 1 | LDC2005T06 | | LDC | research | LDC membership 05 required |
 | Chinese Treebank 5.0 | LDC2005T01 | | LDC | research | LDC membership 05 required |
 | Chinese Treebank English Parallel Corpus | LDC2003E07 | | LDC | research | LDC membership 03 required |
Es->En | EPPS Spanish verbatim transcriptions May 2004 - Jan 2005 | | 100h transcribed | UPC | ELRA | Transcribed by UPC |
 | EPPS Spanish final text edition May 2004 - Jan 2005 | | | EC | ELRA | English and Spanish parallel texts are aligned. Verbatim transcriptions are also aligned with FTE by RWTH. |
 | EPPS Spanish final text edition April 1996 - Jan 2005 | | | EC | RWTH | Provided to TC-STAR by RWTH |
En->Es | EPPS English verbatim transcriptions May 2004 - Jan 2005 | | 100h transcribed | RWTH | ELRA | Transcribed by RWTH |
 | EPPS English final text edition May 2004 - Jan 2005 | | | EC | ELRA | English and Spanish parallel texts are aligned. Verbatim transcriptions are also aligned with FTE by RWTH. |
 | EPPS English final text edition April 1996 - Jan 2005 | | | EC | RWTH | Provided to TC-STAR by RWTH |
English and Spanish
Any of the training resources listed in the table above may be used in addition to the EPPS training sets. To obtain the EPPS sets on DVD, please contact Christian Gollan at RWTH.
Chinese
Any of the training resources listed in the table above may be used, except the TDT3 audio files and transcriptions for December 1998 (the development and test sets will be built from these files).
The verbatim transcriptions of EPPS are shared with the ASR evaluation. The difference is that only two files are used for SLT (instead of three for ASR), as only 25,000 words are needed.
You can also find development data of the 2005 SLT evaluation on the SLT Run #1 page.
Direction | Input | Files |
---|---|---|
Es-->En (cortes) | Final Text Edition | |
 | Verbatim | (new version of the data updated on 28 August 2006) |
 | ASR | (new version of the data updated on 15 February 2006) |
En-->Es (EPPS) | Final Text Edition | |
 | Verbatim | (new version of the data updated on 25 January 2005) |
 | ASR | Individual system output of the data (new version updated on 15 February 2006) |
Es-->En (EPPS) | Final Text Edition | (new version of the data updated on 1 December 2005) |
 | Verbatim | (new version of the data updated on 19 January 2005) |
 | ASR | Individual system output of the data (new version updated on 15 February 2006) |
Zh-->En (VoA) | Verbatim | (new version of the data updated on 23 January 2006) |
 | ASR | (new version of the data updated on 23 January 2006) |
The translation guidelines for the translation agencies are available here (MS Word document).
Validation report from SPEX for ENES development files (FTE & Verbatim)
Validation report from SPEX for ESEN development files (FTE & Verbatim)
Word graphs/lattices are regularly posted on WP2's own web page.
The single-best dev06 results for English and Spanish can be found on the LIMSI web page (updated on January 10).
The scoring tools proposed by ELDA are Perl scripts. They can be downloaded as a zip archive containing:
The package also includes sample files for checking that the installation works:
For the ASR task, the alignment tool from RWTH is available here.
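The RWTH alignment tool aligns hypothesis and reference word sequences, the same alignment that underlies word error rate. As a rough illustration of what such an alignment-based score computes (not the actual tool, which also reports the alignments themselves), a minimal WER via Levenshtein distance over tokens:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Minimal sketch of word error rate: Levenshtein edit distance
    (substitutions + insertions + deletions) over whitespace tokens,
    normalised by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("a b c d", "a x c"))  # 1 substitution + 1 deletion over 4 words = 0.5
```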
Submissions must be sent by email to hamon@elda.org before Wednesday, March 1st.
Submitted files should use the NIST MT format for TSTSET:
<TSTSET SetID="..." SrcLang="..." TrgLang="...">
<DOC DocID="..." SysID="...">
<SEG id="1">
TRANSLATED TEXT
</SEG>
...
</DOC>
...
</TSTSET>
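The skeleton above can be produced mechanically from a list of translated segments. A minimal sketch (the helper name and the attribute values shown are our placeholders, not prescribed by the guideline):

```python
from xml.sax.saxutils import escape

def make_tstset(set_id, src_lang, trg_lang, doc_id, sys_id, segments):
    """Wrap translated segments in the NIST MT TSTSET skeleton shown above.
    Segments are numbered from 1; text is XML-escaped."""
    lines = [f'<TSTSET SetID="{set_id}" SrcLang="{src_lang}" TrgLang="{trg_lang}">',
             f'<DOC DocID="{doc_id}" SysID="{sys_id}">']
    for i, seg in enumerate(segments, start=1):
        lines.append(f'<SEG id="{i}">')
        lines.append(escape(seg))
        lines.append('</SEG>')
    lines += ['</DOC>', '</TSTSET>']
    return "\n".join(lines)

out = make_tstset("TEST06_EPPS_FTE_ENES", "en", "es",
                  "doc1", "ORG-PRIMARY-system1",
                  ["texto traducido"])
print(out)
```

A real submission would hold one DOC per source document, with SEG ids matching the source file.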
Output file and source file formats are the same with the following exceptions:
Recommendations:
Submit one test set per file, i.e. one file for English verbatim transcripts, one file for English ASR output, one file for English FTE, etc.
The SysID attribute must identify the organisation, the condition, and the system. For instance, if the organisation ORG submits one primary condition and two secondary conditions for the English verbatim transcripts (one with the same system as for the primary condition and another with a different system or system version), then three files will be sent for this SetID, with the following SysIDs:
In the same manner, output files must identify the organisation, with the same constraints. For instance, if the organisation ORG translates the file "TC-STAR_RUN2_TEST06_EPPS_FTE_ENES_SRC.TXT", the translated file should be renamed "TC-STAR_RUN2_TEST06_EPPS_FTE_ENES_ORG-PRIMARY-system1.TXT".
("system1" can be omitted if there is only one system per condition)
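The renaming rule above is mechanical; a small sketch of it (the helper name is ours, and the default condition/system values are just the example from the text):

```python
def submission_filename(source_file: str, org: str,
                        condition: str = "PRIMARY",
                        system: str = "system1") -> str:
    """Rename a source file per the convention above: the trailing _SRC
    marker is replaced by <ORG>-<CONDITION>[-<system>].
    Pass system="" when there is only one system per condition."""
    sys_id = "-".join(p for p in (org, condition, system) if p)
    return source_file.replace("_SRC.", f"_{sys_id}.")

print(submission_filename("TC-STAR_RUN2_TEST06_EPPS_FTE_ENES_SRC.TXT", "ORG"))
# TC-STAR_RUN2_TEST06_EPPS_FTE_ENES_ORG-PRIMARY-system1.TXT
```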
For the ASR task, the SetID attribute should be:
System descriptions:
For each experiment, a one-page system description must be provided, describing the data used, the approaches (algorithms), the configuration, the processing time, etc. The document should also contain references. The file should be named "<SysID>.txt".
Submission:
Submissions must be sent by email at the following address: hamon@elda.org
with the subject: "[TC-STAR] Submission <SysID>"
and with the archived files in attachment.
The deadline is Wednesday, 1st of March, 23:59 CET (5:59 pm in Pittsburgh and Yorktown).
A return receipt will be sent within 24 hours.
Source files
Direction | Input | Files |
---|---|---|
En-->Es (EPPS) | Final Text Edition | |
 | Verbatim | English source set (NIST MT format) |
 | ASR | |
Es-->En (EPPS + PARL) | Final Text Edition | Spanish source set (NIST MT format) |
 | Verbatim | Spanish source set (NIST MT format) |
 | ASR | |
Zh-->En (VoA) | Verbatim | Chinese source set (NIST MT format, GB2312 encoded) |
 | ASR | |
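Note that the Chinese source set is GB2312-encoded, while the reference sets are UTF-8; tools that expect a single encoding may need the Chinese file converted first. A minimal sketch (the file names used are hypothetical):

```python
def gb2312_to_utf8(src_path: str, dst_path: str) -> None:
    """Re-encode a GB2312 text file as UTF-8.
    Illustrative sketch; raises UnicodeDecodeError on bytes outside GB2312."""
    with open(src_path, encoding="gb2312") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(src.read())

# Hypothetical usage:
# gb2312_to_utf8("TC-STAR_RUN2_TEST06_VOA_ZHEN_SRC.TXT", "src_utf8.txt")
```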
Reference files
Direction | Input | Files |
---|---|---|
En-->Es (EPPS) | Final Text Edition | |
 | Verbatim | Spanish reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) |
 | ASR | |
Es-->En (EPPS + PARL) | Final Text Edition | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) - updated on 31 March 2006 |
 | Verbatim | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) - updated on 28 August 2006 |
 | ASR | |
EPPS | Final Text Edition | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) |
 | Verbatim | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) |
 | ASR | |
PARL | Final Text Edition | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) |
 | Verbatim | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) - updated on 28 August 2006 |
 | ASR | |
Zh-->En (VoA) | Verbatim | English reference translations set (NIST MT format, UTF-8 encoded, 2 reference translations) |
 | ASR | |
Preliminary results are available: