Utilizing a Pretrained Language Model (BERT) to Classify Preservice Physics Teachers’ Written Reflections
- Location: Deutsche Nationalbibliothek Frankfurt am Main
- ISSN: 1560-4306
- Extent: Online resource
- Language: English
- Notes: Online resource.
- Bibliographic citation: Utilizing a Pretrained Language Model (BERT) to Classify Preservice Physics Teachers’ Written Reflections. International Journal of Artificial Intelligence in Education (2 May 2022), 1–28.
- Creator
- Contributor: SpringerLink (Online service)
- DOI: 10.1007/s40593-022-00290-6
- URN: urn:nbn:de:101:1-2022071021491140363036
- Rights: Open Access; access to the object is unrestricted.
- Last update: 15.08.2025, 7:20 AM CEST
Data provider: Deutsche Nationalbibliothek. If you have any questions about the object, please contact the data provider.
Associated
- Wulff, Peter
- Mientus, Lukas
- Nowak, Anna
- Borowski, Andreas
- SpringerLink (Online service)