@@ -13,7 +13,7 @@
{{box cssClass="box_green"}} |
== SDMX Implementation == |
|
-International standards like Statistical Data and Metadata eXchange ([[SDMX>>https://sdmx.org/||rel="noopener noreferrer" target="_blank"]]) have provided a robust foundation for metadata exchange in official statistics. However, our experience has revealed significant limitations influencing the achievement of semantic interoperability. SKMS addresses these gaps by integrating SDMX structures into a semantic interpretation environment via the [[Interoperability Basis platform>>https://basis.semanticip.org/xwiki/bin/view/Main/||rel="noopener noreferrer" target="_blank"]]. The platform supports semantic alignment, enrichment, and publication of data exchange standards using a knowledge management system, modeling tools, namespace control, and persistent [[URI>>https://www.w3.org/Addressing/URL/uri-spec.html||rel="noopener noreferrer" target="_blank"]] infrastructure. |
+International standards like Statistical Data and Metadata eXchange ([[SDMX>>https://sdmx.org/||rel="noopener noreferrer" target="_blank"]]) have provided a robust foundation for metadata exchange in official statistics. However, our experience has revealed significant limitations that hinder semantic interoperability. SKMS addresses these gaps by integrating SDMX structures into a semantic interpretation environment via the [[Interoperability Basis platform>>https://basis.semanticip.org/xwiki/bin/view/Main/]]. The platform supports semantic alignment, enrichment, and publication of data exchange standards using a knowledge management system, modeling tools, namespace control, and a persistent [[URI>>https://www.w3.org/Addressing/URL/uri-spec.html||rel="noopener noreferrer" target="_blank"]] infrastructure.
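As a minimal sketch of this kind of integration, the fragment below renders an SDMX-style code list as a SKOS concept scheme with persistent URIs. The code list, labels, and base URI are illustrative assumptions, not actual SKMS identifiers:

```python
# Sketch: publish an SDMX-style code list as SKOS concepts under a
# persistent base URI. All identifiers below are hypothetical examples.

BASE = "https://example.org/id/codelist/CL_FREQ"  # hypothetical persistent URI

codes = {
    "A": "Annual",
    "Q": "Quarterly",
    "M": "Monthly",
}

def to_skos_turtle(base: str, codes: dict) -> str:
    """Render a code list as a SKOS ConceptScheme in Turtle."""
    lines = [
        "@prefix skos: <http://www.w3.org/2004/02/skos/core#> .",
        "",
        f"<{base}> a skos:ConceptScheme .",
    ]
    for code, label in codes.items():
        lines += [
            "",
            f"<{base}/{code}> a skos:Concept ;",
            f'    skos:notation "{code}" ;',
            f'    skos:prefLabel "{label}"@en ;',
            f"    skos:inScheme <{base}> .",
        ]
    return "\n".join(lines)

print(to_skos_turtle(BASE, codes))
```

In practice such output would be generated from the managed namespace registry rather than from hard-coded values.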
{{/box}} |
|
== Linked Data == |
@@ -26,13 +26,13 @@
|
The High-Level Group for the Modernisation of Official Statistics ([[HLG-MOS>>https://unece.org/statistics/networks-of-experts/high-level-group-modernisation-statistical-production-and-services||rel="noopener noreferrer" target="_blank"]]), under the United Nations Economic Commission for Europe ([[UNECE>>https://unece.org/ru||rel="noopener noreferrer" target="_blank"]]), addresses the challenges of data interoperability within national statistical systems. It develops and promotes methods, models (including semantic models such as ontologies), and standards through coordinated initiatives. One of these initiatives is the Data Governance Framework for Statistical Interoperability ([[DAFI>>https://unece.org/sites/default/files/2024-03/HLG2023%20DAFI%20Final_0.pdf]]), published in 2023. This framework provides a reference model for implementing governance programs that support the creation, sharing, and use of data in ways that preserve semantic meaning across systems. |
|
-Another priority of [[HLG-MOS>>https://unece.org/statistics/networks-of-experts/high-level-group-modernisation-statistical-production-and-services||rel="noopener noreferrer" target="_blank"]] is the development of [[rich (“smart”) metadata>>http://cosmos-conference.org/index.html||rel="noopener noreferrer" target="_blank"]] — metadata that is standardised (understandable and reusable across contexts), active (capable of driving statistical processes), and aligned with the [[FAIR principles>>https://www.go-fair.org/||rel="noopener noreferrer" target="_blank"]] : Findable, Accessible, Interoperable, and Reusable. |
+Another priority of [[HLG-MOS>>https://unece.org/statistics/networks-of-experts/high-level-group-modernisation-statistical-production-and-services||rel="noopener noreferrer" target="_blank"]] is the development of [[rich (“smart”) metadata>>http://cosmos-conference.org/index.html]] — metadata that is standardised (understandable and reusable across contexts), active (capable of driving statistical processes), and aligned with the [[FAIR principles>>https://www.go-fair.org/||rel="noopener noreferrer" target="_blank"]]: Findable, Accessible, Interoperable, and Reusable.
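As an illustrative sketch only (the field names are assumptions, not an SKMS or HLG-MOS schema), a simple self-check of a metadata record against the four FAIR dimensions might look like this:

```python
# Sketch: a toy FAIR self-check for a metadata record.
# Field names and the example record are illustrative assumptions.

def fair_check(record: dict) -> dict:
    """Map a metadata record onto the four FAIR dimensions."""
    return {
        "findable": bool(record.get("uri")) and bool(record.get("title")),
        "accessible": bool(record.get("access_url")),
        "interoperable": bool(record.get("vocabulary")),  # standard vocabulary in use
        "reusable": bool(record.get("license")),
    }

record = {
    "uri": "https://example.org/id/indicator/unemployment-rate",  # hypothetical
    "title": "Unemployment rate",
    "access_url": "https://example.org/data/unemployment-rate.csv",
    "vocabulary": "http://purl.org/linked-data/cube",
    "license": "CC-BY-4.0",
}

print(fair_check(record))
```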
|
We share these goals and move forward in step with [[HLG-MOS>>https://unece.org/statistics/networks-of-experts/high-level-group-modernisation-statistical-production-and-services||rel="noopener noreferrer" target="_blank"]] initiatives — SKMS already reflects key principles and objectives that resonate with this international agenda. |
|
A key enabler of [[FAIR>>https://www.go-fair.org/||rel="noopener noreferrer" target="_blank"]] implementation in statistics is the use of semantic technologies for both data dissemination and the formalization of knowledge in the form of semantic models (semantic assets). Semantic assets (SAs) are reusable formal representations of data such as: (1) metadata schemas (e.g. XML or RDF), (2) core data models or common models, (3) ontologies, thesauri, and reference data (e.g. code lists, taxonomies, glossaries). These assets are published as open data standards and used in the development of knowledge management systems, harmonizing indicators and classifications, and preparing LOSD. Semantic models support unambiguous interpretation, semantic search, and the discovery of data across disparate sources. |
|
-The adoption of LOSD creates new opportunities for discovering, searching, comparing, and integrating statistical data from multiple sources through [[Semantic Web>>https://www.w3.org/standards/||rel="noopener noreferrer" target="_blank"]] technologies, including semantic integration methods. This approach enables the achievement of the highest level of data maturity according to the [[5-star model>>https://5stardata.info/en/||target="_blank"]] proposed by Tim Berners-Lee. |
+The adoption of LOSD creates new opportunities for discovering, searching, comparing, and integrating statistical data from multiple sources through [[Semantic Web>>https://www.w3.org/standards/||rel="noopener noreferrer" target="_blank"]] technologies, including semantic integration methods. This approach allows statistical data to reach the highest level of maturity in the 5-star model proposed by Tim Berners-Lee.
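The 5-star ladder can be sketched as a cumulative check, where each level presupposes all previous ones. The boolean property names below are illustrative simplifications:

```python
# Sketch: score a dataset against Tim Berners-Lee's 5-star open data
# model. Property names are illustrative simplifications.

def five_star(props: dict) -> int:
    """Return 0-5 stars; each level requires all previous levels."""
    ladder = [
        "open_license",      # 1: on the web under an open licence
        "structured",        # 2: machine-readable structure (e.g. a spreadsheet)
        "non_proprietary",   # 3: non-proprietary format (e.g. CSV)
        "uses_uris",         # 4: URIs identify things (RDF)
        "links_other_data",  # 5: linked to other data (LOSD)
    ]
    stars = 0
    for prop in ladder:
        if not props.get(prop):
            break
        stars += 1
    return stars

csv_dataset = {"open_license": True, "structured": True, "non_proprietary": True}
losd_dataset = dict.fromkeys(
    ["open_license", "structured", "non_proprietary", "uses_uris", "links_other_data"],
    True,
)

print(five_star(csv_dataset), five_star(losd_dataset))  # 3 5
```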
|
== Operational Cycle == |
|
@@ -40,15 +40,15 @@
|
The full operational cycle consists of seven stages: |
|
-1. Collection and systematization of methodological documents (creation of an electronic [[library>>doc:working:Library.WebHome||target="_blank"]]), adding annotations, discovering terms-candidates and primary markup with related terms and documents. Publishing documents in original structured form with hypertext markup in a specialized "[[Methodology>>doc:working:Methodology.WebHome||target="_blank"]]" section. |
-1. The development of [[glossaries>>doc:working:Glossary.WebHome||target="_blank"]] (the formation of detailed terminological articles), indicators descriptions based on the analysis of methodological documents, and then the generation of corresponding semantic assets. Refinement of hyper-text markup in accordance with modelled glossaries. |
+1. Collection and systematization of methodological documents (creation of an electronic library), adding annotations, discovering candidate terms, and primary markup with related terms and documents. Publishing documents in their original structured form with hypertext markup in a specialized "Methodology" section.
+1. Development of glossaries (detailed terminological articles) and indicator descriptions based on the analysis of methodological documents, followed by the generation of corresponding semantic assets. Refinement of hypertext markup in accordance with the modelled glossaries.
1. Publishing semantic assets generated in the SKMS. |
1. Development, alignment, and cataloging of the necessary SAs, code lists, or other models of statistical domains in accordance with semantic standards.
-1. Importing datasets from external sources or data warehouses (DWH). Transformation of datasets using the [[RDF Data Cube Vocabulary>>https://www.w3.org/TR/vocab-data-cube/||target="_blank"]], semantic enrichment. |
+1. Importing datasets from external sources or data warehouses (DWH). Transformation of datasets using the RDF Data Cube Vocabulary, semantic enrichment. |
1. Visualization and validation of semantic models and LOSD sets. |
1. Construction of rich metadata that is transmitted for publication in external analytical systems.
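The seven stages above can be sketched as a simple pipeline of functions over shared state. The stage bodies are placeholders for illustration only; the real SKMS stages are substantially richer:

```python
# Sketch: the seven-stage operational cycle as a pipeline of placeholder
# stages over shared state. Stage names paraphrase the list above.

def stage(name: str):
    """Create a placeholder stage that records its completion."""
    def run(state: dict) -> dict:
        state.setdefault("completed", []).append(name)
        return state
    return run

PIPELINE = [
    stage("collect_documents"),       # 1. library, annotations, primary markup
    stage("develop_glossaries"),      # 2. glossaries, indicator descriptions
    stage("publish_assets"),          # 3. publish generated semantic assets
    stage("align_and_catalog"),       # 4. align and catalog SAs and code lists
    stage("import_and_transform"),    # 5. import datasets, RDF Data Cube
    stage("visualize_and_validate"),  # 6. validate models and LOSD sets
    stage("build_rich_metadata"),     # 7. rich metadata for external systems
]

def run_cycle(state: dict = None) -> dict:
    """Run all stages in order, threading the state through each."""
    state = state or {}
    for step in PIPELINE:
        state = step(state)
    return state

print(run_cycle()["completed"])
```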
|
-SKMS is based on the XWiki extension to using semantic technologies. It provides special templates for publishing documents, glossary terms, and indicator descriptions. They are used by domain experts to formalize statistical knowledge and provide their human-readable representation fixed in SAs. The LOSD pipeline is supported by generators and constructors developed to automate the formation of LOSD, semantic models and semantically enriched metadata. SKMS may be integrated with a cataloging service that supports not only the organisation of semantic assets, but also their visualization, access, and dissemination through standard interfaces such as [[OpenAPI>>https://www.openapis.org/||target="_blank"]] and [[SPARQL Endpoints>>https://sparql.dev/article/SPARQL_endpoints_and_how_to_use_them.html||target="_blank"]]. |
+SKMS is built as an XWiki extension that employs semantic technologies. It provides special templates for publishing documents, glossary terms, and indicator descriptions. Domain experts use these templates to formalize statistical knowledge and to provide its human-readable representation, fixed in SAs. The LOSD pipeline is supported by generators and constructors developed to automate the formation of LOSD, semantic models, and semantically enriched metadata. SKMS may be integrated with a cataloging service that supports not only the organisation of semantic assets, but also their visualization, access, and dissemination through standard interfaces such as OpenAPI and SPARQL endpoints.
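As a sketch of dissemination through a SPARQL endpoint, the fragment below assembles a SPARQL Protocol GET request. The endpoint URL is hypothetical and no request is actually sent:

```python
# Sketch: build a SPARQL Protocol GET URL for a SELECT query against a
# catalog endpoint. The endpoint URL is a hypothetical example.
from urllib.parse import urlencode

ENDPOINT = "https://example.org/sparql"  # hypothetical catalog endpoint

QUERY = """PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
  ?concept a skos:Concept ;
           skos:prefLabel ?label .
}
LIMIT 10"""

def sparql_get_url(endpoint: str, query: str) -> str:
    """Encode the query as the standard 'query' parameter of a GET request."""
    return endpoint + "?" + urlencode({"query": query})

url = sparql_get_url(ENDPOINT, QUERY)
print(url)
```

A real client would send this URL with an Accept header such as application/sparql-results+json to choose the result format.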
|
== Benefits == |
|