BIROn - Birkbeck Institutional Research Online

    Mathematical theories in the era of Big Data

    Zumpano, E. and Caroprese, L. and Veltri, P. and Calì, A. and Radulescu, F. (2019) Mathematical theories in the era of Big Data. Mathematical Problems in Engineering 2019, pp. 1-2. ISSN 1024-123X.

    27959.pdf - Published Version of Record
    Available under License Creative Commons Attribution.

    Abstract

    Data integration concerns the process of acquiring and managing heterogeneous data so that they can be used through a unified view. Data can reside in different sources, be merged into a single data structure, and be reconciled in the user view. Data volumes keep growing and an ever-increasing amount of data is available across different information sources; providing a single, uniform user interface over them is therefore an increasingly interesting challenge. To address this, data integration has become, over the last decades, the focus of extensive theoretical work in computer science on schema alignment and data fusion. Nevertheless, many issues remain open and unsolved. Recent years have seen an impressive growth in the volume, speed, and heterogeneity of the generated data, as well as in its variety and quality. We are in the era of big data! Data is generated, collected, and processed at an unprecedented scale, and data-driven decisions influence many aspects of modern society. Data integration contributes to rapid and efficient decisions and is required in social and life-related areas such as emergency management, quality of life, and health-related data management. As a consequence, there is a growing interest in applying mathematical theories and methods to model, integrate, and manage massive and fast-changing data, and in retrieving the valid and valuable knowledge they imply.

    The target of this special issue was to disseminate recent research results on data integration and to promote integration between the data management and knowledge representation communities. The aim was to collect articles describing novel theoretical as well as applied work on methodologies for big data modeling, integration, and management.

    In the paper “Big Data Validity Evaluation Based on MMTD” by N. Zhou et al., medium mathematics systems are introduced for the evaluation of big data validity, and a data validity evaluation method based on medium logic is proposed. The contributions of the paper are as follows: based on the 3V properties of big data, the dimensions that have a major influence on data validity are determined; data completeness, correctness, and compatibility are defined; a model based on the medium truth degree is proposed to measure each dimension of data validity; and a multidimensional model based on the medium truth degree is proposed to measure the integrated value of data validity.

    In the paper “A Compound Structure for Wind Speed Forecasting Using MKLSSVM with Feature Selection and Parameter Optimization” by S. Sun et al., a compound MKLSSVM model optimized by the HGSA algorithm and integrated with the signal decomposition technique EEMD, namely EEMD-HGSA-MKLSSVM, is proposed for short-term wind speed forecasting. Four sets of mean half-hour wind speeds, selected randomly from historical wind speed data for 2015 collected at a wind farm in Anhui, China, are used as case studies to evaluate the forecasting performance of the EEMD-HGSA-MKLSSVM model.

    In the paper “A Negotiation Optimization Strategy of Collaborative Procurement with Supply Chain Based on Multi-Agent System” by C. Chen and C. Xu, the process of collaborative procurement, in which buyers and suppliers are prone to conflict due to differences in needs and preferences, is investigated.
The paper provides a novel perspective for the analysis of intelligent supply chain management; it constructs a negotiation model based on a multi-agent system and proposes a negotiation optimization strategy combined with machine learning.

    In the paper “High-Order Degree and Combined Degree in Complex Networks” by S. Wang et al., several novel centrality metrics are defined: the high-order degree and combined degree of undirected networks, and the high-order out-degree, high-order in-degree, and combined out-degree and in-degree of directed networks. These metrics measure node importance in terms of the number of a node's neighbors. The metrics are explored on several well-known networks, and it is proved that both degree centrality and eigenvector centrality are special cases of the high-order degree of an undirected network, and that both the in-degree and the PageRank algorithm without damping factor are special cases of the high-order in-degree of a directed network.
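
    To make the "special case" claim for undirected networks concrete, the following is a minimal Python sketch under the assumption that the order-k degree of node i is the i-th row sum of the k-th power of the adjacency matrix (i.e., the number of length-k walks leaving i); the paper's exact definition may differ. Under that reading, k = 1 recovers ordinary degree centrality, and the normalized order-k degree approaches eigenvector centrality as k grows.

    ```python
    import numpy as np

    # Toy undirected graph (adjacency matrix): path edge 0-1 attached to triangle 1-2-3.
    A = np.array([
        [0, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 1, 0, 1],
        [0, 1, 1, 0],
    ], dtype=float)

    def high_order_degree(A, k):
        """Assumed order-k degree: i-th row sum of A^k (number of length-k walks from i)."""
        return np.linalg.matrix_power(A, k).sum(axis=1)

    # k = 1 recovers the ordinary degree centrality.
    print(high_order_degree(A, 1))  # [1. 3. 2. 2.]

    # For large k, the normalized order-k degree aligns with eigenvector centrality
    # (power iteration on a connected, non-bipartite graph).
    k = 50
    hod = high_order_degree(A, k)
    hod /= np.linalg.norm(hod)
    eigvals, eigvecs = np.linalg.eigh(A)
    ev_centrality = np.abs(eigvecs[:, np.argmax(eigvals)])
    print(np.allclose(hod, ev_centrality, atol=1e-6))  # True, up to normalization
    ```

    The convergence to eigenvector centrality is simply the power method in disguise, which is why the eigenvector-centrality "special case" is plausible under this assumed definition.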

    Metadata

    Item Type: Article
    School: Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences
    Research Centres and Institutes: Birkbeck Institute for Data Analytics
    Depositing User: Administrator
    Date Deposited: 28 Jun 2019 08:54
    Last Modified: 09 Aug 2023 12:46
    URI: https://eprints.bbk.ac.uk/id/eprint/27959

