Suggested readings:

  1. Natural Language Processing (Almost) from Scratch.
    Proposes a unified neural network architecture for sequence labeling tasks.
  2. Neural Architectures for Named Entity Recognition.
    End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF.
    Combine character-level word representations with word embeddings to enhance sequence labeling systems (see the character-plus-word sketch after this list).
  3. Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks.
    Multi-task Multi-domain Representation Learning for Sequence Tagging.
    Apply transfer learning across tasks and domains for sequence tagging.
  4. Named Entity Recognition for Chinese Social Media with Jointly Trained Embeddings.
    Proposes a joint training objective for the embeddings that makes use of both NER-labeled text and unlabeled raw text.
  5. Improving Named Entity Recognition for Chinese Social Media with Word Segmentation Representation Learning.
    An Empirical Study of Automatic Chinese Word Segmentation for Spoken Language Understanding and Named Entity Recognition.
    Use word segmentation outputs as additional features for sequence labeling systems.
  6. Semi-supervised Sequence Tagging with Bidirectional Language Models.
    State-of-the-art model on the CoNLL-2003 NER task; adds pre-trained contextual embeddings from bidirectional language models to the sequence labeling model (see the language-model sketch after this list).
  7. Character-Based LSTM-CRF with Radical-Level Features for Chinese Named Entity Recognition.
    State-of-the-art model on the SIGHAN 2006 NER task.
  8. Named Entity Recognition with Bidirectional LSTM-CNNs.
    Shows how to incorporate external lexicon features into the tagger (see the lexicon-feature sketch after this list).
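
The papers in item 2 build their taggers from both character-level and word-level representations. Below is a minimal PyTorch sketch of that idea, not the papers' exact architecture: a character CNN produces a fixed-size character representation per word, which is concatenated with the word embedding and fed to a BiLSTM. The dimensions, vocabulary sizes, and the omission of the CRF output layer are simplifying assumptions.

```python
import torch
import torch.nn as nn


class CharWordTagger(nn.Module):
    """Character CNN + word embeddings -> BiLSTM -> per-token tag scores."""

    def __init__(self, word_vocab=10000, char_vocab=100, n_tags=9,
                 word_dim=100, char_dim=30, char_filters=30, hidden=200):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # Convolve over each word's characters and max-pool to a fixed-size vector.
        self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel_size=3, padding=1)
        # BiLSTM over the concatenated word + character representations.
        self.lstm = nn.LSTM(word_dim + char_filters, hidden,
                            batch_first=True, bidirectional=True)
        # Emission scores per token; the papers place a CRF layer on top of these.
        self.out = nn.Linear(2 * hidden, n_tags)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        b, t, c = char_ids.shape
        chars = self.char_emb(char_ids.view(b * t, c)).transpose(1, 2)
        char_repr = torch.relu(self.char_cnn(chars)).max(dim=2).values.view(b, t, -1)
        tokens = torch.cat([self.word_emb(word_ids), char_repr], dim=-1)
        hidden_states, _ = self.lstm(tokens)
        return self.out(hidden_states)  # (batch, seq_len, n_tags)


# Example with random ids: 2 sentences of 12 tokens, at most 8 characters per word.
scores = CharWordTagger()(torch.randint(1, 10000, (2, 12)),
                          torch.randint(1, 100, (2, 12, 8)))
print(scores.shape)  # torch.Size([2, 12, 9])
```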
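
Reading 6 (TagLM-style semi-supervised tagging) augments the tagger's input with contextual states from a bidirectional language model pre-trained on unlabeled text. The sketch below shows only the concatenation step; the "pre-trained" LM here is an untrained stand-in kept frozen, and all shapes and sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

word_dim, lm_dim, hidden, n_tags = 100, 512, 200, 9

# Stand-in for a pre-trained bidirectional LM producing (batch, seq_len, lm_dim)
# states; in the paper these come from forward and backward LMs trained on
# large unlabeled corpora and are kept fixed while training the NER model.
pretrained_lm = nn.LSTM(word_dim, lm_dim // 2, batch_first=True, bidirectional=True)
for p in pretrained_lm.parameters():
    p.requires_grad = False  # the LM is not fine-tuned

word_emb = nn.Embedding(10000, word_dim, padding_idx=0)
tagger_lstm = nn.LSTM(word_dim + lm_dim, hidden, batch_first=True, bidirectional=True)
tag_proj = nn.Linear(2 * hidden, n_tags)

word_ids = torch.randint(1, 10000, (2, 12))
tokens = word_emb(word_ids)                         # (2, 12, word_dim)
lm_states, _ = pretrained_lm(tokens)                # (2, 12, lm_dim)
augmented = torch.cat([tokens, lm_states], dim=-1)  # token + contextual LM features
hidden_states, _ = tagger_lstm(augmented)
print(tag_proj(hidden_states).shape)                # torch.Size([2, 12, 9]) emission scores
```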
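
Reading 8 encodes matches against external lexicons as extra per-token features alongside word and character representations. The following sketch shows one simplified way such match features can be produced (longest exact match with a B/I/O encoding); the toy lexicon, the matching scheme, and the feature values are assumptions rather than the paper's exact procedure.

```python
# Toy person lexicon; the paper draws its lexicons from much larger resources.
PERSON_LEXICON = {"barack obama", "obama"}


def lexicon_features(tokens, lexicon=PERSON_LEXICON):
    """Return one feature per token: 'B'/'I' inside a lexicon match span, else 'O'."""
    feats = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        for j in range(len(tokens), i, -1):  # prefer the longest match
            if " ".join(tokens[i:j]).lower() in lexicon:
                feats[i:j] = ["B"] + ["I"] * (j - i - 1)
                i = j - 1
                break
        i += 1
    return feats


print(lexicon_features(["Barack", "Obama", "visited", "Ottawa"]))
# ['B', 'I', 'O', 'O'] -> these categorical features are embedded and
# concatenated with the word and character features before the BiLSTM.
```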