DLnLD: Deep Learning and Linked Data — Last Call for Papers

Workshop co-located with LREC-COLING 2024
Date: May 21, 2024
Submissions due: 9th March 2024
Venue: Torino, Italy and online

For up-to-date info, check: https://dl-n-ld.github.io/

Call for Papers

----------------------------------------------------------------------------------------
What does Linguistic Linked Data bring to Deep Learning and vice versa?
Let's bring together these two complementary approaches in NLP.
----------------------------------------------------------------------------------------

Motivations for the Workshop

Since the appearance of transformers (Vaswani et al., 2017), Deep Learning (DL) and neural approaches have made a huge contribution to Natural Language Processing (NLP), either through highly specialized models for specific applications or via Large Language Models (LLMs) (Devlin et al., 2019; Brown et al., 2020; Touvron et al., 2023) that are efficient few-shot learners for many NLP tasks. Such models usually build on huge web-scale data (raw multilingual corpora and annotated, specialized, task-related corpora) that are now widely available on the Web. This approach has clearly shown many successes, but it still suffers from several weaknesses, such as the cost and impact of training on raw data, biases, hallucinations, and lack of explainability, among others (Nah et al., 2023).

The Linguistic Linked Open Data (LLOD) community (Chiarcos et al., 2013) aims at creating and distributing explicitly structured data (modelled as RDF graphs) and at interlinking such data across languages. This collection of datasets, gathered in the LLOD Cloud (Chiarcos et al., 2020), contains a huge amount of multilingual ontological (e.g., DBpedia (Lehmann et al., 2015)), lexical (e.g., DBnary (Sérasset, 2015), WordNet (McCrae et al., 2014), Wikidata (Vrandečić and Krötzsch, 2014)), and linguistic (e.g., the Universal Dependencies treebanks (Nivre et al., 2020; Chiarcos et al., 2021), the DBpedia Abstract Corpus (Brümmer et al., 2016)) information, structured using common metadata (e.g., OntoLex (McCrae et al., 2017), NIF (Hellmann et al., 2013)) and standardised data categories (e.g., lexinfo (Cimiano et al., 2011), OLiA (Chiarcos and Sukhareva, 2015)).

Both communities make striking contributions that appear to be highly complementary. However, while knowledge (ontological) graphs are now routinely used in DL, there is still very little research studying the value of linguistic/lexical knowledge in the context of DL. We think that there is now a real opportunity to bring both communities together to take the best of both worlds. Indeed, with the growing body of work on Graph Neural Networks (Wu et al., 2023) and on embeddings of RDF graphs (Ristoski et al., 2019), there are more and more opportunities to apply DL techniques to build, interlink or enhance Linguistic Linked Open Datasets, to borrow data from the LLOD Cloud to enhance neural models on NLP tasks, or to take the best of both worlds for specific NLP use cases.

Submission Topics

This workshop aims at gathering researchers who work on the interaction between DL and LLOD in order to discuss what each approach has to bring to the other.
For this, we welcome contributions on original work involving some of the following (non-exhaustive) topics:

• Deep Learning for Linguistic Linked Data, among which (but not exclusively):
    • Modelling, Resources & Interlinking
    • Relation Extraction
    • Corpus annotation
    • Ontology localization
    • Knowledge/Linguistic Graph creation or expansion
• Linguistic Linked Data for Deep Learning, among which (but not exclusively):
    • Linguistic/Knowledge Graphs as training data
    • Fine-tuning LLMs using Linguistic Linked (meta)Data
    • Graph Neural Networks
    • Knowledge/Linguistic Graph embeddings
    • LLOD for model explainability/sourcing
    • Neural models for under-resourced languages
• Joint Deep Learning and Linguistic Data applications:
    • Use cases combining Language Models and Structured Linguistic Data
    • LLOD and DL for Digital Humanities
    • Question Answering on graph data

All application domains (Digital Humanities, FinTech, Education, Linguistics, Cybersecurity…) as well as approaches (NLG, NLU, Data Extraction…) are welcome, provided that the work is based on the use of BOTH Deep Learning techniques and Linguistic Linked (meta)Data.

Important Dates

All deadlines are 11:59 PM UTC-12:00 ("anywhere on Earth").

• Submissions due: 9th March 2024 (hard deadline: there will be no extension)
class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Notification of acceptance: 2nd April 2024<br class=""></span></div><div class=""><span class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Camera-ready due: 12th April 2024</span></div></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><span class=""><br class=""></span></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><b class=""><i class="">Authors kit</i></b><br class=""></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><span class=""><br class=""></span></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><span class="">All papers must follow the LREC-COLING 2024 two-column</span><span class=""> </span><span class="">format, using the supplied official style files. The templates</span><span class=""> </span><span class="">can be downloaded from the Style Files and Formatting page</span><span class=""> </span><span class="">provided on the website. Please do not modify these style</span><span class=""> </span><span class="">files, nor should you use templates designed for other</span><span class=""> </span><span class="">conferences. Submissions that do not conform to the required</span><span class=""> </span><span class="">styles, including paper size, margin width, and font size</span><span class=""> restrictions, will be rejected without review.</span></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><br class="">LREC-COLING 2024 Author’s Kit Page: <a href="https://lrec-coling-2024.org/authors-kit/" class="">https://lrec-coling-2024.org/authors-kit/</a> <br class=""><br class=""><b class=""><i class="">Paper submission<br class=""></i></b><br class="">Submission is electronic at <a href="https://softconf.com/lrec-coling2024/dlnld2024/" class="">https://softconf.com/lrec-coling2024/dlnld2024/</a></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><br class=""></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><b class=""><i class="">Workshop Chairs<br class=""></i></b></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><b class=""><i class=""><br class=""></i></b></div><div class="" style="caret-color: rgb(0, 0, 0); color: rgb(0, 0, 0);"><span class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Gilles Sérasset, Université Grenoble Alpes, France<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Hugo Gonçalo Oliveira, University of Coimbra, Portugal<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Giedre Valunaite Oleskeviciene, Mykolas Romeris University, Lithuania<br class=""><div class=""><b class=""><i class="">Program Committee<br class=""></i></b></div><div class=""><b class=""><i class=""><br class=""></i></b></div><div class=""><font color="#000000" class=""><span class="Apple-tab-span" style="white-space: pre;"> </span><span class="">• Mehwish Alam, Télécom Paris, Institut Polytechnique de Paris, France</span></font></div><span class="Apple-tab-span" style="white-space: pre;"> </span>• Russa Biswas, Hasso Plattner Institute, Potsdam, Germany<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Milana Bolatbek, Al-Farabi Kazakh National University, Kazakhstan<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Michael Cochez, Vrije Universiteit Amsterdam, 
Netherlands<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Milan Dojchinovski, Czech Technical University in Prague, Czech Republic<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Basil Ell, University of Oslo, Norway<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Robert Fuchs, University of Hamburg, Germany<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Radovan Garabík, L’. Štúr Institute of Linguistics, Slovak Academy of Sciences, Slovakia<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Daniela Gifu, Romanian Academy, Iasi branch & Alexandru Ioan Cuza University of Iasi, Romania<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Katerina Gkirtzou, Athena Research Center, Maroussi, Greece<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Jorge Gracia del Río, University of Zaragoza, Spain<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Dagmar Gromann, University of Vienna, Austria<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Dangis Gudelis, Mykolas Romeris University, Lithuania<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Ilan Kernerman, Lexicala by K Dictionaries, Israel<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Chaya Liebeskind, Jerusalem College of Technology, Israel<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Marco C. Passarotti, Università Cattolica del Sacro Cuore, Milan, Italy<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Heiko Paulheim, University of Mannheim, Germany<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Alexandre Rademaker, IBM Research Brazil and EMAp/FGV, Brazil<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Georg Rehm, DFKI GmbH, Berlin, Germany<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Harald Sack, Karlsruhe Institute of Technology, Karlsruhe, Germany<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Didier Schwab, Université Grenoble Alpes, France<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Ranka Stanković, University of Belgrade, Serbia<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Andon Tchechmedjiev, IMT Mines Alès, France<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Dimitar Trajanov, Ss. Cyril and Methodius University – Skopje, Macedonia<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Ciprian-Octavian Truică, POLITEHNICA Bucharest, Romania<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Nicolas Turenne, Guangdong University of Foreign Studies, China<br class=""><span class="Apple-tab-span" style="white-space: pre;"> </span>• Slavko Žitnik, University of Ljubljana, Slovenia</span></div></div></div></div></div></div></div></body></html>