Agenda
- Transfer learning before BERT
- Transfer learning with BERT
- BERT – Model (including Transformer)
- BERT – Learning (how training is done)
- BERT – Inference (how testing and downstream tasks are done)
Erratum: the claim in the video that BERT cannot be used for generation tasks is incorrect. By predicting [MASK] tokens, words can be generated step by step and assembled into a sentence. That said, to our knowledge (as of June 2019), the model with the strongest generation ability is GPT-2.
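The step-by-step generation the erratum describes can be sketched as a simple loop: append a [MASK] token, predict it, and repeat. The hard-coded bigram table below is a hypothetical stand-in for a real masked language model; with actual BERT you would score "&lt;prefix&gt; [MASK]" with the fill-mask head and take the top-ranked token instead.

```python
# Toy sketch of generation via iterative [MASK] prediction.
# BIGRAM_MODEL is a made-up stand-in for BERT's fill-mask predictions,
# used only so the loop structure is runnable and self-contained.
BIGRAM_MODEL = {
    "the": "cat",
    "cat": "sat",
    "sat": "down",
}

def fill_mask(tokens):
    """Predict the word for the trailing [MASK] from its left context."""
    assert tokens[-1] == "[MASK]"
    prev = tokens[-2]
    return BIGRAM_MODEL.get(prev, "[UNK]")

def generate(prefix, steps):
    """Left-to-right generation: append [MASK], fill it in, repeat."""
    tokens = prefix.split()
    for _ in range(steps):
        tokens.append("[MASK]")
        tokens[-1] = fill_mask(tokens)
    return " ".join(tokens)

print(generate("the", 3))  # the cat sat down
```

Because BERT was trained to fill masks given bidirectional context rather than to continue text left-to-right, this works but is awkward, which is why autoregressive models such as GPT-2 generate better.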
【Speaker Bio】 Kota (柯達) – Head of Machine Learning at Beluga AI
Areas of expertise:
- Customer Data Platform (user behavior analysis, user tagging, recommendation systems, sales forecasting)
- Physiological sound event detection (SED)
- Receipt recognition (OCR, template matching, fuzzy matching)
- Chatbots (question-answering chatbots, NER, intent detection, multi-turn dialogue management)
- Speech synthesis (Mandarin text-to-speech)