An efficient approach for low latency processing in stream data
Stream data is data generated continuously from different sources, ideally defined as data with no discrete beginning or end. Processing stream data is a part of big data analytics that aims at querying the continuously arriving data and extracting meaningful information from the stream.
Main Authors: | Nirav Bhatt, Amit Thakkar |
---|---|
Format: | Article |
Language: | English |
Published: | PeerJ Inc., 2021-03-01 |
Series: | PeerJ Computer Science |
Subjects: | Data stream; Stream processing; Latency |
Online Access: | https://peerj.com/articles/cs-426.pdf |
_version_ | 1818379489122451456 |
---|---|
author | Nirav Bhatt Amit Thakkar |
author_facet | Nirav Bhatt Amit Thakkar |
author_sort | Nirav Bhatt |
collection | DOAJ |
description | Stream data is data generated continuously from different sources, ideally defined as data with no discrete beginning or end. Processing stream data is a part of big data analytics that aims at querying the continuously arriving data and extracting meaningful information from the stream. Although such streams were earlier processed using batch analytics, applications such as stock market analysis, patient monitoring, and traffic analysis now see a drastic difference in outcome if results are produced at the level of hours or minutes. The primary goal of any real-time stream processing system is to process stream data as soon as it arrives. Correspondingly, analytics of stream data also needs to consider surrounding dependent data. For example, stock market analytics results are often useless if their associated or dependent parameters, which affect the result, are not considered. In real-world applications, these dependent stream data usually arrive from a distributed environment; hence the stream processing system has to be designed to deal with delays in the arrival of such data from distributed sources. We have designed a stream processing model which can deal with all possible latency and provide an end-to-end low-latency system. We have performed stock market prediction by considering affecting parameters, such as USD, oil price, and gold price, with an equal arrival rate. We have calculated the Normalized Root Mean Square Error (NRMSE), which simplifies the comparison among models with different scales. A comparative analysis of the experiments presented in the report shows a significant improvement in the result when the affecting parameters are considered. In this work, we have used a statistical approach to forecast the probability of data latency arising from distributed sources. Moreover, we have performed preprocessing of stream data to ensure at-least-once delivery semantics. In the direction of providing low latency in processing, we have also implemented exactly-once processing semantics. Extensive experiments have been performed with varying window sizes and data arrival rates. We have concluded that system latency can be reduced when the window size is equal to the data arrival rate. |
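The description compares forecasting models via NRMSE but does not state which normalization the paper uses; a common convention divides RMSE by the range of the observed values. A minimal sketch under that assumption (the function name `nrmse` is illustrative, not from the paper):

```python
import math

def nrmse(actual, predicted):
    """RMSE normalized by the range of observed values, so that
    models forecasting series on different scales stay comparable."""
    assert len(actual) == len(predicted) and len(actual) > 0
    mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    return math.sqrt(mse) / (max(actual) - min(actual))
```

For example, predictions [1, 1] against observations [0, 2] give RMSE 1 over a range of 2, i.e. NRMSE 0.5. Other normalizations (by the mean or standard deviation of the observations) are also in use.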
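The description mentions building exactly-once processing semantics on top of at-least-once delivery; one standard way to do this is to deduplicate replayed records by a unique id before processing. A hypothetical sketch (the paper's actual mechanism is not given in this record):

```python
def deduplicate(records):
    """Filter a stream of (record_id, payload) pairs so each id passes
    through once: at-least-once delivery may replay a record after a
    failure, and dropping replays restores exactly-once processing."""
    seen = set()
    for rec_id, payload in records:
        if rec_id in seen:
            continue  # replayed delivery; already handled downstream
        seen.add(rec_id)
        yield rec_id, payload
```

This trades memory for the `seen` set; production systems typically bound it with checkpointed offsets or an expiry window.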
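The conclusion that latency drops when the window size equals the data arrival rate can be illustrated with a tumbling window: if each arrival batch exactly fills one window, the window is emitted immediately and no record waits for a later batch. A minimal sketch (illustrative, not the authors' implementation):

```python
def tumbling_windows(stream, window_size):
    """Group an ordered stream into consecutive fixed-size windows.
    Each window is emitted as soon as it fills; a trailing partial
    window (fewer than window_size records) is not emitted."""
    window = []
    for record in stream:
        window.append(record)
        if len(window) == window_size:
            yield list(window)
            window.clear()
```

With a window size smaller or larger than the per-batch arrival count, some records sit in a half-full window until the next batch arrives, which is the latency the experiments measure.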
first_indexed | 2024-12-14T02:03:36Z |
format | Article |
id | doaj.art-19ab9f37e3c44f3e9a5b13562abdf1c6 |
institution | Directory Open Access Journal |
issn | 2376-5992 |
language | English |
last_indexed | 2024-12-14T02:03:36Z |
publishDate | 2021-03-01 |
publisher | PeerJ Inc. |
record_format | Article |
series | PeerJ Computer Science |
spelling | doaj.art-19ab9f37e3c44f3e9a5b13562abdf1c6 | eng | PeerJ Inc. | PeerJ Computer Science | 2376-5992 | 2021-03-01 | 7:e426 | 10.7717/peerj-cs.426 | An efficient approach for low latency processing in stream data | Nirav Bhatt (Information Technology, Chandubhai S Patel Institute of Technology, CHARUSAT, Anand, Gujarat, India); Amit Thakkar (Computer Science and Engineering, Chandubhai S Patel Institute of Technology, CHARUSAT, Anand, Gujarat, India) | https://peerj.com/articles/cs-426.pdf | Data stream; Stream processing; Latency |
title | An efficient approach for low latency processing in stream data |
title_sort | efficient approach for low latency processing in stream data |
topic | Data stream Stream processing Latency |
url | https://peerj.com/articles/cs-426.pdf |