Syntactic Analysis: Literature seminars

During the literature seminars we will discuss the assigned scientific articles and questions related to them. You are expected to prepare according to the guidelines below. The seminars are meant as learning opportunities, where you will get help to understand two scientific articles related to the course in detail. Both articles present methods for using neural networks in parsing, thus complementing the lectures.

We are aware that you have not yet taken the machine learning course. You may therefore view the neural networks mainly as black boxes, and do not have to understand that part, including the maths, in detail.
Groups
The seminars will be held in smaller groups.

Preparation
For both seminars you are expected to read the assigned article carefully. You should also prepare answers to the questions and discussion points below and bring them to the seminar.

- Seminar 1:
Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, Noah A. Smith. Recurrent Neural Network Grammars. NAACL 2016. (You do not need to go into any detail about the math, just strive to understand the reasoning behind it.)
- The seminar will be held in small groups on Zoom (link in Studium), Wednesday February 9, 2022.
- Group 1 (Moa, Ali, Ebba, Mathias, Thea): 10.15-11
- Group 2 (Yini, Iliana, Annika, Matilde, Yiu Kei): 11.15-12
- Questions and discussion points:
- Can you briefly summarize the article (in about a minute)?
- What do you think is the most important point the authors make in the article?
- What do YOU personally think is the most interesting point in the article?
- What are the differences between RNNG and CKY/Earley?
- Why is the time complexity of RNNGs lower than for CKY and Earley?
- What did you learn when reading the article? (Or: why do you think you did not learn anything?)
  - Is there something you do not understand or that is difficult to understand?
- Come up with some further questions/issues to discuss!
- Would you recommend this article? Why or why not?
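As background for the question on CKY/Earley time complexity, here is a minimal CKY recognizer sketch. The toy grammar and example words are made up for illustration and are not taken from the article; the three nested loops over span length, span start, and split point are what give CKY its cubic running time.

```python
from itertools import product

# Toy grammar in Chomsky normal form (hypothetical, for illustration only):
# S -> NP VP, VP -> V NP, NP -> "she" | "fish", V -> "eats"
UNARY = {"she": {"NP"}, "fish": {"NP"}, "eats": {"V"}}
BINARY = {("NP", "VP"): {"S"}, ("V", "NP"): {"VP"}}

def cky_recognize(words):
    """Return True if the toy grammar derives the sentence.

    The three nested loops (span length, span start, split point)
    each range over O(n) values, giving O(n^3) time overall.
    """
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(UNARY.get(w, ()))
    for span in range(2, n + 1):           # span length
        for i in range(n - span + 1):      # span start
            j = i + span
            for k in range(i + 1, j):      # split point
                for left, right in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= BINARY.get((left, right), set())
    return "S" in chart[0][n]
```

Contrasting this chart-based search with the single left-to-right pass of an RNNG derivation may help when discussing why the transition-based approach avoids the cubic cost.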
- Seminar 2: Eliyahu Kiperwasser and Yoav Goldberg. Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations. TACL. Volume 4, 2016
  - The seminar will be held in small groups on Zoom, Wednesday March 2, 2022:
- Group 1 (Moa, Ali, Ebba, Mathias, Thea): 10.15-11
- Group 2 (Yini, Iliana, Matilde, Yiu Kei): 11.15-12
- Questions:
- Can you briefly summarize the article (in about a minute)?
- What do you think is the most important point the authors make in the article?
- What do YOU personally think is the most interesting point in the article?
- Compare graph-based and transition-based dependency parsing. What are the main strengths and weaknesses of each?
  - Tables 1 and 3 show the results both for the K&G parser and for other parsers. Try to assess the impact of techniques such as external word embeddings, beam search, dynamic oracles, and POS tags (gold or predicted) by comparing scores for systems that differ with respect to each technique.
- What did you learn when reading the article? (Or: why do you think you did not learn anything?)
  - Is there something you do not understand or that is difficult to understand?
- Come up with some further questions/issues to discuss!
- Would you recommend this article? Why or why not?
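As background for the comparison of graph-based and transition-based dependency parsing, here is a minimal arc-standard transition system sketch. It only replays a given transition sequence; the K&G parser additionally scores candidate transitions with BiLSTM feature representations, which is omitted here. The example sentence and sequence are made up for illustration.

```python
def arc_standard(words, transitions):
    """Apply an arc-standard transition sequence to a sentence.

    Words are indexed from 1; index 0 is the artificial root.
    Returns the set of (head, dependent) arcs built.
    """
    stack = [0]                          # start with the root on the stack
    buffer = list(range(1, len(words) + 1))
    arcs = set()
    for t in transitions:
        if t == "SHIFT":                 # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":            # top of stack heads the item below it
            dep = stack.pop(-2)
            arcs.add((stack[-1], dep))
        elif t == "RIGHT-ARC":           # item below the top heads the top
            dep = stack.pop()
            arcs.add((stack[-1], dep))
    return arcs
```

For "she eats fish", the sequence SHIFT, SHIFT, LEFT-ARC, SHIFT, RIGHT-ARC, RIGHT-ARC yields the arcs eats→she, eats→fish, and root→eats. Note that each word is shifted once and popped once, which is why transition-based parsing runs in linear time, in contrast to graph-based parsers that score all head-dependent pairs.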