Fluctuating Small Data Imputation with Lagrange Interpolation Based

Abstract

The data imputation process, the subject of this research, addresses a problem common in the current era of big data. Small data sets suffer from similar problems, such as data loss, especially in time-series data. Data loss has several causes, one of which is a communication failure on the network of an Internet of Things (IoT) system. Imputing missing values in highly fluctuating data is a further challenge: not every method achieves good accuracy on such data. This study imputes small, fluctuating data sets using Lagrange interpolation of degree one (linear), degree two (quadratic), and degree three (cubic). Measured by MSE, cubic Lagrange interpolation generally achieves better accuracy than the other polynomial degrees at data-loss rates of 10%, 30%, and 50%. These results also suggest that if Lagrange interpolation achieves good accuracy on fluctuating data, it can achieve comparable accuracy on non-fluctuating data.
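As a rough sketch of the approach described above, the snippet below imputes missing values in a small, fluctuating series with Lagrange interpolation of degrees one through three and compares the resulting MSE. The toy data, the missing-value mask, and the choice of the degree+1 nearest known neighbours as interpolation nodes are all illustrative assumptions; the paper does not publish its dataset or node-selection rule here.

```python
import numpy as np

def lagrange_interpolate(xs, ys, x):
    """Evaluate the Lagrange polynomial through the points (xs, ys) at x."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

def impute_missing(series, degree):
    """Fill NaN entries using degree-d Lagrange interpolation over the
    degree+1 nearest known samples (an assumed neighbour-selection scheme)."""
    series = series.copy()
    known = [i for i, v in enumerate(series) if not np.isnan(v)]
    for i, v in enumerate(series):
        if np.isnan(v):
            nodes = sorted(sorted(known, key=lambda k: abs(k - i))[:degree + 1])
            xs = np.array(nodes, dtype=float)
            ys = np.array([series[k] for k in nodes])
            series[i] = lagrange_interpolate(xs, ys, float(i))
    return series

# Hypothetical fluctuating series with ~30% of the samples removed.
rng = np.random.default_rng(0)
t = np.arange(20, dtype=float)
truth = np.sin(t) + 0.3 * rng.standard_normal(20)
observed = truth.copy()
missing = rng.choice(20, size=6, replace=False)
observed[missing] = np.nan

for degree, name in [(1, "linear"), (2, "quadratic"), (3, "cubic")]:
    imputed = impute_missing(observed, degree)
    mse = np.mean((imputed[missing] - truth[missing]) ** 2)
    print(f"{name}: MSE = {mse:.4f}")
```

On this toy data the ranking of the three degrees will vary with the noise seed; the paper's finding is that the cubic variant generally wins at 10%, 30%, and 50% loss on its evaluation data.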

Citation (APA)

Oktaviani, I. D., Abdurohman, M., & Erfianto, B. (2023). Fluctuating Small Data Imputation with Lagrange Interpolation Based. In Smart Innovation, Systems and Technologies (Vol. 324, pp. 211–217). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-19-7447-2_19
