
A shared decalogue on Big Data


A shared decalogue on Big Data was the aim of the meeting organised by Lepida on May 22nd, a highly participatory event attended by over 60 people representing the Emilia-Romagna Region, municipalities/unions, provinces, health authorities, land reclamation consortia, the regional agencies ARPAE, ERGO and ART-ER, and 7 leading market suppliers.

A representative of each stakeholder, whether an Institution or a Supplier, expressed his or her thoughts on Big Data in just 3 minutes, without slides, for a roundup of experiences... and inexperiences. In presenting the event, Lepida DG Gianluca Mazzini urged participants to address the definition of Open and Big Data, the identification of data that are/are not already available, the definition of the purposes of those data, and the direction of such purposes towards processing, obtaining results, and informing citizens.

The elements of the decalogue resulting from the meeting can be summarized as follows.

1] Citizen by design: designing services on Big Data with the end user in mind: citizens, associations, enterprises. Optimizing, simplifying and making infrastructural resources efficient. Interdisciplinarity and information return.

2] Legal aspects: purpose and treatment. Governance and planning. Privacy and ownership/availability of data.

3] Knowledge, Awareness, Competence: each organization should know its own data, but also third-party data that could be helpful. Understanding the difference between Open and Big Data, defining the data life cycle. Identifying the skills and competences of the professionals who will shape the data.

4] Standardization of data, use cases and vocabularies: data efficiency and normalization. Identification of use cases and areas where the Big Data approach may lead to otherwise unattainable results. Analysis of update volumes and frequencies, as well as attention to historical depth.

5] Complexity: starting a virtuous path that, as the complexity of processing increases, allows a cost reduction for the overall system. The need to create interdisciplinary teams that allow data to be aggregated within but also beyond organizational borders.

6] Interoperability, processes, algorithms, models, services: definition, standardization and sharing of interoperability modes for systems, processes, algorithms, models and services related to Big Data. Construction of natively integrated ecosystems that make it possible to distinguish the subjects feeding the data from those using them.

7] Quality: data are numerous and varied in nature, often fragmented even within the same organization; it is necessary to have clean data, to be certain of the information processed, and to know what to expect when handling data over the long run (10-20 years). Data quality and quality of algorithms, certified or certifiable.

8] Training and Information: the lack of objectives, information and knowledge/competence must be overcome by providing training and information at all levels, within the Bodies and to citizens.

9] Collaboration: involvement of small and large organizations, reuse, best practices, social value of the data.

10] Public Data, Private Data: virtuous sharing of data, identification of Public Body data that may be of interest to the Private Sector, and vice versa.

A decalogue to be integrated and perfected, also keeping Artificial Intelligence in mind.