Paper Review - Connecting the Dots: Document-level Neural Relation Extraction with Edge-oriented Graphs
Authors: Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
Cross-lingual learning
Most languages do not have training data available for building state-of-the-art models, so our ability to create intelligent systems for these languages is limited as well. Cross-lingual learning (CLL) is one possible remedy for the lack of data in low-resource languages. In essence, it is an effort to utilize annotated data from other languages when building new NLP models. In the CLL setting, the target languages usually lack resources, while the source languages are resource-rich and can be used to improve results for the former....
BERT Recap
Overview
BERT (Bidirectional Encoder Representations from Transformers) uses a “masked language model”: it randomly masks some tokens in the input and predicts the original vocabulary id of each masked token. BERT shows that “pre-trained representations reduce the need for many heavily-engineered task-specific architectures”.
BERT Specifics
There are two steps in the BERT framework: pre-training and fine-tuning. During pre-training, the model is trained on unlabeled data over different pre-training tasks....
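To make the masking step concrete, here is a minimal sketch of that corruption procedure in Python. It is an illustration under simplifying assumptions: the 15% mask probability, the [MASK] id, and the label convention (-100 for unmasked positions) are example choices, and BERT's additional mask/random/keep split is omitted.

```python
# Minimal sketch of masked-language-model input corruption (illustrative only).
import random

MASK_ID = 103      # assumed id for the [MASK] token
MASK_PROB = 0.15   # roughly 15% of tokens are masked

def mask_tokens(token_ids, mask_prob=MASK_PROB, mask_id=MASK_ID, seed=0):
    """Return (corrupted_ids, labels): labels hold the original id at masked
    positions and -100 (ignored by the loss) everywhere else."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tid in token_ids:
        if rng.random() < mask_prob:
            corrupted.append(mask_id)   # hide the token from the model
            labels.append(tid)          # the model must predict the original id
        else:
            corrupted.append(tid)
            labels.append(-100)         # position not scored by the MLM loss
    return corrupted, labels

if __name__ == "__main__":
    ids = [2023, 2003, 1037, 7099, 6251]   # example WordPiece ids
    print(mask_tokens(ids))
```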
Authors: Kevin Clark, Urvashi Khandelwal, Omer Levy, Christopher D. Manning
Authors: Kenneth Marino, Ruslan Salakhutdinov, Abhinav Gupta
Fourier Transform
Virtually everything in the world can be described via a waveform: a function of time, space, or some other variable. For instance, sound waves, the price of a stock, etc. The Fourier Transform gives us a unique and powerful way of viewing these waveforms: all waveforms, no matter what you scribble or observe in the universe, are actually just the sum of simple sinusoids of different frequencies....
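As a quick numerical illustration of this idea, the sketch below builds a signal from two sinusoids (50 Hz and 120 Hz, arbitrary example frequencies) and recovers those frequencies with NumPy's FFT.

```python
# Build a signal from two sinusoids and recover their frequencies with the FFT.
import numpy as np

fs = 1000                        # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)      # one second of samples
signal = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# The two largest spectral peaks correspond to the 50 Hz and 120 Hz components.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))             # -> [50.0, 120.0]
```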