Danish BERT

BERT (Bidirectional Encoder Representations from Transformers) is a deep neural network model used in Natural Language Processing. The network learns the grammar and semantics of human language by training on large text corpora. Danish BERT focuses on making BERT better for the Nordic languages.

This repository provides downloadable weights for a Danish, a Norwegian and a Swedish BERT model trained from scratch. The models can be used in downstream tasks to improve the performance of Nordic Natural Language Processing systems.
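As a hedged sketch of the downstream use described above, the weights can be loaded with the Hugging Face `transformers` library and queried as a masked language model. The model identifier below is an assumption (a community upload of these weights); substitute the path to the weights downloaded from the repository if you have them locally.

```python
# Minimal sketch: fill a masked token with Danish BERT via transformers.
# The model id "Maltehb/danish-bert-botxo" is an assumption -- replace it
# with a local path to the downloaded weights if needed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Maltehb/danish-bert-botxo")

# Ask the model to complete a Danish sentence at the [MASK] position.
for prediction in fill_mask("København er Danmarks [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The same weights can also be loaded with `AutoModel.from_pretrained` to produce contextual embeddings for fine-tuning on downstream tasks such as named entity recognition or classification.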

Data and Distribution(s)

Additional information

Field	Value
Landing page	https://github.com/botxo/danish_bert
Metadata last updated	December 8, 2022, 12:35 (UTC)
Metadata created	June 18, 2020, 10:20 (UTC)
Topic	Language and orthography; Education, culture and sport
GUID	https://data.gov.dk/dataset/lang/ee8441c7-8ee9-46bc-92f3-0a79f572b62a
Contact name	Certainly
Language	Danish
URI	https://data.gov.dk/dataset/lang/ee8441c7-8ee9-46bc-92f3-0a79f572b62a
Publisher name	Certainly
Type	https://data.gov.dk/concept/core/lang-resource-type/Tool