Jagesh Maharjan (Jugs)

Bidirectional Encoder Representations from Transformers (BERT)

    less than 1 minute read

    BERT

    Pre-Training
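BERT is pre-trained with a masked language-model (MLM) objective: roughly 15% of input tokens are chosen as prediction targets, and of those, 80% are replaced by [MASK], 10% by a random token, and 10% left unchanged. A minimal sketch of that masking step (the toy vocabulary and helper names are illustrative, not from any library):

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "runs", "fast"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """Apply BERT-style MLM masking.

    Selects ~mask_prob of positions as prediction targets; of those,
    80% become [MASK], 10% a random vocabulary token, 10% stay unchanged.
    Returns the corrupted sequence and a dict {position: original token}.
    """
    rng = rng or random.Random(0)
    masked, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok                    # model must predict the original
            roll = rng.random()
            if roll < 0.8:
                masked[i] = MASK                # 80%: replace with [MASK]
            elif roll < 0.9:
                masked[i] = rng.choice(VOCAB)   # 10%: random token
            # else: 10% keep the token unchanged (but still predict it)
    return masked, targets
```

Keeping 10% of target tokens unchanged forces the encoder to produce a useful representation for every position, since it cannot tell which tokens were corrupted.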

    Fine Tuning
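Fine-tuning adds a small task-specific head on top of the pre-trained encoder; for classification this is typically a single linear layer plus softmax over the final hidden state of the [CLS] token, with all weights then trained end-to-end. A minimal NumPy sketch of such a head (shapes and names are illustrative):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array of logits."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def classify(cls_vector, W, b):
    """Task head for fine-tuning: linear layer + softmax on the [CLS] output.

    cls_vector: encoder's final hidden state for [CLS], shape (hidden,)
    W, b:       the only newly initialized weights, shape (labels, hidden), (labels,)
    Returns a probability distribution over the task's labels.
    """
    return softmax(W @ cls_vector + b)
```

During fine-tuning, gradients flow through this head back into the pre-trained encoder, so both the head and the encoder weights are updated for the downstream task.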

In progress; this post will be updated soon.

    Tags: embedding, NLP, NLU, Transformer

    Updated: February 14, 2019



    © 2019 Jagesh Maharjan. Powered by Jekyll & Minimal Mistakes.