AliceMind: Pure-Text and Multimodal Generative Pre-training, Techniques and Applications
Li Chenliang | Senior Algorithm Engineer

CONTENT
01 Pure-text generative pre-training: PALM 2.0
02 Unified multimodal generative pre-training model
03 Business applications of generative pre-training
04 Summary

Pure-text generative pre-training: PALM 2.0 | Example: Generative QA

…swer. Seo et al. (2017) proposed BiDAF, which represents the context at different levels of granularity and uses a bi-directional attention flow mechanism for answer extraction. SLQA (Wang, Yan, and Wu 2018) improves answer quality with a hierarchical attention fusion network, in which attention and fusion are conducted horizontally and vertically across layers between the question and the paragraph. Recently, we have seen emerging BERT-based models (Devlin et al. 2018), which have proven effective for reading comprehension. Multi-paragraph reading comprehension has also attracted interest from the academic (Yan et al. 2019) and industrial (He et al. 2018) communities.

Sequence-to-sequence QA. The sequence-to-sequence architecture has been broadly used in a variety of QA tasks that do not read contextual paragraphs. GenQA (Yin et al. 2016) combines knowledge retrieval and sequence-to-sequence learning to produce fluent answers, but it only deals with simple questions containing a single fact. COREQA (He et al. 2017) extends it with a copy mechanism and can answer an information-inquiring question (i.e., a factual question containing one or more topic entities). In contrast, Fu and Feng (2018) introduced a new attention mechanism that explores heterogeneous memory for answer-sentence generation; it encourages the decoder to actively interact with the memory in a memory-augmented encoder-decoder framework. Moreover, Tao et al. (2018) proposed a multi-head attention mechanism to capture multiple semantic aspects of a given query and generate an informative response in a dialogue system.

Natural Answer Generation. There have been several attempts at using machine reading to generate natural
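The multi-head attention mentioned above splits the model dimension into several subspaces so that each head can attend to the memory for a different semantic aspect of the query. A minimal NumPy sketch of this idea follows; it uses random matrices in place of learned projection weights and is an illustration of the general mechanism, not the implementation of Tao et al. (2018).

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, memory, num_heads, rng):
    """Toy multi-head attention: each head projects the query and memory
    into its own subspace, attends there, and the heads are concatenated."""
    d_model = query.shape[-1]
    d_head = d_model // num_heads
    # Random projections stand in for learned weight matrices.
    wq = rng.standard_normal((num_heads, d_model, d_head))
    wk = rng.standard_normal((num_heads, d_model, d_head))
    wv = rng.standard_normal((num_heads, d_model, d_head))
    heads = []
    for h in range(num_heads):
        q = query @ wq[h]                             # (len_q, d_head)
        k = memory @ wk[h]                            # (len_m, d_head)
        v = memory @ wv[h]                            # (len_m, d_head)
        scores = softmax(q @ k.T / np.sqrt(d_head))   # (len_q, len_m)
        heads.append(scores @ v)                      # (len_q, d_head)
    return np.concatenate(heads, axis=-1)             # (len_q, d_model)

rng = np.random.default_rng(0)
query = rng.standard_normal((3, 8))    # 3 query tokens, model dim 8
memory = rng.standard_normal((5, 8))   # 5 memory tokens
out = multi_head_attention(query, memory, num_heads=2, rng=rng)
print(out.shape)  # (3, 8)
```

Because each head applies its own projections before the scaled dot-product, different heads can assign high attention weight to different memory tokens for the same query, which is what lets the model capture multiple semantic aspects at once.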