VinaGPT-2: Generative pretrained transformer for Vietnamese
When thinking about Artificial Intelligence, people envision a machine capable of performing tasks once exclusive to humans, and programs that can communicate or converse much like a human being are especially convincing. Fortunately, such a machine is not far-fetched given current technological advancement: GPT-2, a model that excels at generating coherent passages and can be adapted downstream to a wide range of writing purposes, comes close. But given how enormous the English model is, we need ways to adapt it to Vietnamese if we want to build creative applications with it. This paper describes how we pre-trained VinaGPT-2alpha.