Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Tensor-based big data management scheme for dimensionality reduction problem in smart grid systems: SDN perspective

Kaur, Devinder, Aujla, Gagangeet Singh, Kumar, Neeraj, Zomaya, Albert Y., Perera, Charith ORCID: https://orcid.org/0000-0002-0190-3346 and Ranjan, Rajiv 2018. Tensor-based big data management scheme for dimensionality reduction problem in smart grid systems: SDN perspective. IEEE Transactions on Knowledge and Data Engineering 30 (10) , pp. 1985-1998. 10.1109/TKDE.2018.2809747

PDF - Accepted Post-Print Version (2MB)

Abstract

A smart grid (SG) integrates the traditional power grid with advanced information and communication infrastructure to support bidirectional energy flow between the grid and end users. The many smart devices deployed in SG systems generate massive volumes of data, which can strain the networking infrastructure deployed between users and the grid. Hence, an efficient data transmission technique is required to provide the desired QoS to end users in this environment. The data generated by smart devices in an SG is typically high-dimensional, comprising multiple heterogeneous attributes whose values change over time. This high dimensionality can degrade the performance of most solutions designed for this environment, and most existing schemes reported in the literature address the dimensionality reduction problem with complex operations that may themselves deteriorate the performance of any implemented solution. To address these challenges, this paper proposes a tensor-based big data management scheme for reducing the dimensionality of the big data generated by various smart devices. In the proposed scheme, the Frobenius norm is first applied to the high-order tensors used for data representation, minimizing the reconstruction error of the reduced tensors. Then, an empirical probability-based control algorithm is designed to estimate an optimal path for forwarding the reduced data using software-defined networks, minimizing network load and making effective use of bandwidth. The proposed scheme minimizes the transmission delay incurred when moving the dimensionally reduced data between different nodes. The efficacy of the proposed scheme has been evaluated using extensive simulations on data traces, carried out with the 'R' language and MATLAB.
The big data traces considered for evaluation consist of more than two million entries (2,075,259) collected at a one-minute sampling rate, with heterogeneous features such as voltage, energy, frequency, and electric signals. Moreover, a comparative study across different data traces and a real SG testbed is also presented to demonstrate the efficacy of the proposed scheme. The results obtained show the effectiveness of the proposed scheme with respect to parameters such as network delay, accuracy, and throughput.
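To illustrate the kind of tensor dimensionality reduction the abstract describes, the sketch below uses a truncated higher-order SVD (HOSVD), one standard way to compress a high-order tensor while measuring the Frobenius-norm reconstruction error. This is an assumption-laden illustration, not the authors' exact algorithm: the truncation ranks, the random test tensor, and the HOSVD choice itself are all hypothetical.

```python
# Hedged sketch: truncated HOSVD compression of a 3-way tensor, with the
# Frobenius norm used to measure reconstruction error. Not the paper's
# algorithm; ranks and data are illustrative assumptions.
import numpy as np


def unfold(T, mode):
    """Mode-n unfolding: matricize tensor T along the given mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def fold(M, mode, shape):
    """Inverse of unfold: rebuild a tensor of `shape` from its mode-n unfolding."""
    moved = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(moved), 0, mode)


def mode_multiply(T, U, mode):
    """n-mode product: multiply tensor T by matrix U along `mode`."""
    new_shape = list(T.shape)
    new_shape[mode] = U.shape[0]
    return fold(U @ unfold(T, mode), mode, tuple(new_shape))


def hosvd_truncate(T, ranks):
    """Compress T to a small core tensor plus one factor matrix per mode."""
    # Leading left singular vectors of each unfolding give the factors.
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for mode, U in enumerate(factors):
        core = mode_multiply(core, U.T, mode)
    return core, factors


def reconstruct(core, factors):
    """Expand the reduced representation back to the full tensor."""
    T = core
    for mode, U in enumerate(factors):
        T = mode_multiply(T, U, mode)
    return T


rng = np.random.default_rng(0)
T = rng.standard_normal((8, 9, 10))          # toy stand-in for SG sensor data
core, factors = hosvd_truncate(T, (4, 4, 4))  # 720 values -> 64-entry core
rel_err = np.linalg.norm(T - reconstruct(core, factors)) / np.linalg.norm(T)
```

With full ranks the reconstruction is exact up to floating-point error; truncating the ranks trades a bounded Frobenius-norm error for a much smaller representation to transmit.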
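The abstract's second ingredient, an empirical probability-based choice of forwarding path, can be sketched as follows. Everything concrete here is an assumption for illustration: the per-path load samples, the load threshold, and the rule of picking the path with the highest empirical probability of staying under that threshold are hypothetical stand-ins for the paper's control algorithm.

```python
# Hedged sketch: choose a forwarding path by empirical probability.
# The sample data and threshold rule are illustrative assumptions,
# not the paper's algorithm.

def empirical_probability(loads, threshold):
    """Fraction of observed load samples at or below the threshold."""
    return sum(1 for x in loads if x <= threshold) / len(loads)


def best_path(samples, threshold):
    """Return the candidate path whose observed link loads most often
    stayed at or below `threshold` (highest empirical probability)."""
    return max(samples, key=lambda path: empirical_probability(samples[path],
                                                               threshold))


# Toy example: two candidate SDN paths with observed link-load samples.
samples = {
    "path_a": [0.20, 0.90, 0.40],   # under 0.5 in 2 of 3 observations
    "path_b": [0.30, 0.35, 0.40],   # under 0.5 in all observations
}
chosen = best_path(samples, threshold=0.5)   # -> "path_b"
```

In an SDN setting, a controller with a global view could maintain such per-path load histories and install flow rules for the winning path, which is the kind of centralized decision the abstract attributes to the control algorithm.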

Item Type: Article
Date Type: Publication
Status: Published
Schools: Computer Science & Informatics
Subjects: Q Science > QA Mathematics > QA76 Computer software
Publisher: IEEE
ISSN: 1041-4347
Date of First Compliant Deposit: 25 September 2018
Date of Acceptance: 18 February 2018
Last Modified: 07 Nov 2023 11:32
URI: https://orca.cardiff.ac.uk/id/eprint/114936

Citation Data

Cited 64 times in Scopus.
