Monday, September 11, 2000

8:30
9:30   Welcome (GA)

ICGI (R 01.1)

10:00  Efficient ambiguity detection in C-NFA, a step towards the inference of non deterministic automata  Francois Coste, Daniel Fredouille
10:30  Learning regular languages using non deterministic finite automata  François Denis, Aurélien Lemay, Alain Terlutte
11:00  Break
ICGI (R 01.1)

11:15  Identification in the limit with probability one of stochastic deterministic finite automata  Colin de la Higuera, Franck Thollard
11:45  Combination of estimation algorithms and grammatical inference techniques to learn stochastic context-free grammars  Francisco Nevado, Joan-Andreu Sánchez, José-Miguel Benedí
12:15  Computation of substring probabilities in stochastic grammars
12:45  Lunch (Holiday Inn)
ICGI (R 01.1)

14:00  A comparative study of two algorithms for automata identification  P. Garcia, A. Cano, J. Ruiz
14:30  Inference of finite-state transducers by using regular grammars and morphisms  Francisco Casacuberta
15:00  Improve the learning of subsequential transducers by using alignments and dictionaries  Juan Miguel Vilar
15:30  Break
ICGI (R 01.1)

16:00  An inverse limit of context-free grammars - a new approach to identifiability in the limit  Pavel Martinek
16:30  Synthesizing context free grammars from sample strings based on inductive CYK algorithm
17:00  Constructive learning of context-free languages with a subpansive tree
17:45  Business meeting (open to all participants)
Tuesday, September 12, 2000
9:00   Iterated transductions and efficient learning from positive data: a unifying view  Satoshi Kobayashi
9:30   Smoothing probabilistic automata: an error-correcting approach
10:00  The induction of temporal grammatical rules from multivariate time series
10:30  Break
ICGI (R 01.1)

11:00  A polynomial time learning algorithm of simple deterministic languages via membership queries and a representative sample
11:30  Identification of tree translation rules from examples  Hiroshi Sakamoto, Hiroki Arimura, Setsuo Arikawa
12:00  On the relationship between models for learning in helpful environments  Rajesh Parekh, Vasant Honavar
12:30  Lunch (Holiday Inn)
ICGI (R 01.1)

14:00  Inferring subclasses of contextual languages  J. D. Emerald, K. G. Subramanian, D. G. Thomas
14:30  Learning context-free grammars from partially structured examples  Yasubumi Sakakibara, Hidenori Muramatsu
15:00  Permutations and control sets for learning non-regular language families  Henning Fernau, José M. Sempere
16:00  Social Program
Wednesday, September 13, 2000

ICGI/LLL/CoNLL (GA)

9:00   Opening
9:15   On the Complexity of Consistent Identification of Some Classes of Structure Language  Christophe Costa Florêncio
9:45   The Acquisition of Word Order by a Computational Learning System  Aline Villavicencio
10:15  Corpus-Based Grammar Specialization  Nicola Cancedda, Christer Samuelsson
10:45  Break
Parallel sessions: CoNLL (R 02.3) and ICGI (R 01.1)

11:30  CoNLL: Memory-Based Learning for Article Generation  Guido Minnen, Francis Bond, Ann Copestake
       ICGI:  Computational Complexity of Problems on Probabilistic Grammars and Transducers  Francisco Casacuberta, Colin de la Higuera
12:00  CoNLL: Using Induced Rules as Complex Features in Memory-Based Language Learning  Antal van den Bosch
       ICGI:  Probabilistic k-Testable Tree Languages  Juan Ramón Rico Juan, Jorge Calera Rubio, Rafael C. Carrasco
12:30  CoNLL: Pronunciation by Analogy in Normal and Impaired Readers  R.I. Damper, Y. Marchand
       ICGI:  Counting Extensional Differences in BC-Learning  Frank Stephan, Sebastiaan A. Terwijn
13:00  Lunch (Holiday Inn)
CoNLL (R 02.3)

14:00  Learning Distributed Linguistic Classes  Stephan Raaijmakers
14:30  Knowledge-Free Induction of Morphology Using Latent Semantic Analysis  Patrick Schone, Daniel Jurafsky
15:00  Modeling the Effect of Cross-Language Ambiguity on Human Syntax Acquisition  William Gregory Sakas
15:30  Break
CoNLL (R 02.3)

16:00  CoNLL Poster Session:
       - Using Perfect Sampling in Parameter Estimation of a Whole Sentence Maximum Entropy Language Model  F. Amaya, J.M. Benedí
       - Experiments on Unsupervised Learning for Extracting Relevant Fragments from Spoken Dialog Corpus  Konstantin Biatov
       - Generating Synthetic Speech Prosody with Lazy Learning in Tree Structures  Laurent Blin, Laurent Miclet
       - Inducing Syntactic Categories by Context Distribution Clustering  Alexander Clark
       - ALLiS: a Symbolic Learning System for Natural Language Learning  Hervé Déjean
       - Combining Text and Heuristics for Cost-Sensitive Spam Filtering  José M. Gómez Hidalgo, Manuel Maña López, Enrique Puertas Sanz
       - Genetic Algorithms for Feature Relevance Assignment in Memory-Based Language Processing  Anne Kool, Walter Daelemans, Jakub Zavrel
       - Shallow Parsing by Inferencing with Classifiers  Vasin Punyakanok, Dan Roth
       - Minimal Commitment and Full Lexical Disambiguation: Balancing Rules and Hidden Markov Models  Patrick Ruch, Robert Baud, Pierrette Bouillon, Gilbert Robert
       - Learning IE Rules for a Set of Related Concepts  J. Turmo, H. Rodríguez
       - A Default First Order Family Weight Determination Procedure for WPDV models  Hans van Halteren
       - A Comparison of PCFG Models  Jose Luis Verdú-Mas, Jorge Calera-Rubio, Rafael C. Carrasco
19:00  Banquet
Thursday, September 14, 2000

LLL/CoNLL (R 02.3)

9:30   Incorporating Linguistics Constraints into Inductive Logic Programming  James Cussens, Stephen Pulman
10:00  Increasing our Ignorance of Language: Identifying Language Structure in an Unknown 'Signal'  John Elliott, Eric Atwell, Bill Whyte
10:30  Break
11:00  Invited Talk: Learning in Natural Language: Theory and Algorithmic Approaches  Dan Roth (University of Illinois at Urbana-Champaign)
Parallel sessions: CoNLL (R 02.3) and LLL (R 01.1)

12:00  CoNLL: A Comparison between Supervised Learning Algorithms for Word Sense Disambiguation  Gerard Escudero, Lluís Màrquez, German Rigau
       LLL:   Recognition and Tagging of Compound Verb Groups in Czech  Eva Zácková, Lubos Popelínsky, Milos Nepil
12:30  CoNLL: The Role of Algorithm Bias vs Information Source in Learning Algorithms for Morphosyntactic Disambiguation  Guy De Pauw, Walter Daelemans
13:00  Lunch (Holiday Inn)
Parallel sessions: CoNLL (R 02.3) and LLL (R 01.1)

14:00  CoNLL: Incorporating Position Information into a Maximum Entropy/Minimum Divergence Translation Model  George Foster
       LLL:   Invited Talk: Jörg-Uwe Kietz, Raphael Volz, Alexander Maedche
14:30  CoNLL: Overfitting Avoidance for Stochastic Modeling of Attribute-Value Grammars  Tony Mullen, Miles Osborne
15:00  CoNLL: SIGNLL Business Meeting
       LLL:   Learning from Parsed Sentences with INTHELEX  F. Esposito, S. Ferilli, N. Fanizzi, G. Semeraro
15:40  Break
16:10  CoNLL Shared Task Session (R 02.3):
       - Introduction to the CoNLL-2000 Shared Task: Chunking  Erik F. Tjong Kim Sang, Sabine Buchholz
       - Phrase Parsing with Rule Sequence Processors: an Application to the Shared CoNLL Task  Marc Vilain, David Day
       - A Context Sensitive Maximum Likelihood Approach to Chunking  Christer Johansson
       - Improving Chunking by Means of Lexical-Contextual Information in Statistical Language Models  Ferran Pla, Antonio Molina, Natividad Prieto
       - Single-Classifier Memory-Based Phrase Chunking  Jorn Veenstra, Antal van den Bosch
       - Shallow Parsing as Part-of-Speech Tagging  Miles Osborne
       - Chunking with Maximum Entropy Models  Rob Koeling
       - Learning Syntactic Structures with XML  Hervé Déjean
       - Hybrid Text Chunking  GuoDong Zhou, Jian Su, TongGuan Tey
       - Text Chunking by System Combination  Erik F. Tjong Kim Sang
       - Chunking with WPDV Models  Hans van Halteren
       - Use of Support Vector Learning for Chunk Identification  Taku Kudoh, Yuji Matsumoto
16:10  LLL (R 01.1):
       - Inductive Logic Programming for Corpus-Based Acquisition of Semantic Lexicons  Pascale Sébillot, Pierrette Bouillon, Cécile Fabre
       - Learning from a Substructural Perspective  Pieter Adriaans, Erik de Haas
17:30  LLL closing session and meeting (R 01.1)
18:30  CoNLL closing session and meeting (R 02.3)