“How to Train a Seq2Seq Summarization Model Using “BERT” as Both Encoder and Decoder!! (BERT2BERT)” — in Towards AI, by Ala Falaki, PhD (Jun 20, 2022). BERT is a well-known and powerful pre-trained “encoder” model. Let’s see how we can use it as a “decoder” to form an encoder-decoder…
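For a rough flavor of the warm-starting idea that article covers (not its actual code), here is a minimal sketch assuming the Hugging Face transformers library, which ties two pre-trained BERT checkpoints together as encoder and decoder:

```python
# Minimal sketch: build a BERT2BERT encoder-decoder from two BERT checkpoints.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Warm-start both halves from the same checkpoint; the library adds
# (randomly initialized) cross-attention layers to the decoder half.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# BERT has no dedicated BOS/EOS tokens, so reuse [CLS]/[SEP] for generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```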
“How to Train a Seq2Seq Text Summarization Model With Sample Code (Ft. Huggingface/PyTorch)” — in Towards AI, by Ala Falaki, PhD (Dec 14, 2021). Part 2 of the introductory series about training a Text Summarization model (or any Seq2seq/Encoder-Decoder Architecture) with sample codes…
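As an illustration of the kind of training step that article describes (again a sketch, not its sample code), a single teacher-forced fine-tuning update could look like this, continuing the `model`/`tokenizer` names from the sketch above; the `document`/`summary` pair is made-up example data:

```python
# Minimal sketch of one fine-tuning step for a seq2seq summarization model.
import torch

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

document = "Transformers have become the dominant architecture in NLP ..."
summary = "Transformers dominate NLP."

inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(summary, return_tensors="pt", truncation=True, max_length=64)

# Passing `labels` makes the model compute the cross-entropy loss internally
# (teacher forcing against the shifted target summary tokens).
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels.input_ids,
)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```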
“A Full Introduction on Text Summarization using Deep Learning With Sample Code (Ft. Huggingface)” — in Towards AI, by Ala Falaki, PhD (Nov 16, 2021). An introductory story about the inference process in the Abstractive Text Summarization task (Seq2seq/Encoder-Decoder Architecture) with…
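The inference process that story introduces can be sketched in a few lines, reusing the hypothetical `model`, `tokenizer`, and `document` from the sketches above (an illustration under those assumptions, not the article’s code):

```python
# Minimal sketch of inference: beam-search decoding of a summary.
model.eval()
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=512)

# The decoder generates the summary token by token, conditioned on the
# encoder's representation of the input document.
summary_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=64,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```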