Overview on Performance Testing Approach in Big Data

Ashlesha S. Nagdive, Dr. R. M. Tugnayat, Manish P. Tembhurkar

Abstract


Big data is defined as a large amount of data that requires new technologies and architectures so that value can be extracted from it through capture and analysis. Owing to its various properties, such as volume, velocity, variety, variability, value, complexity, and performance, big data poses many challenges. Many organizations face difficulties in defining test strategies for structured and unstructured data validation, setting up an optimal test environment, working with non-relational databases, and performing non-functional testing. These challenges lead to poor data quality in production, delays in implementation, and increased cost. MapReduce provides a parallel and scalable programming model for data-intensive business and scientific applications. Performance testing is needed to obtain the actual performance characteristics of big data applications, such as response time, maximum online user capacity, and maximum data processing capacity.
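To make the MapReduce programming model mentioned above concrete, the sketch below shows the standard Hadoop word-count job written against the org.apache.hadoop.mapreduce API. It is a generic illustration, not code from the paper; the input and output paths are placeholders passed on the command line.

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in an input line.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word in parallel across reducers.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local aggregation to cut shuffle traffic
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // placeholder input path
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // placeholder output path
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

In performance testing, a job such as this would typically be run against increasing input volumes while measuring job completion time and cluster throughput, which corresponds to the response-time and processing-capacity metrics discussed in the abstract.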

 

Keywords: Big data, testing strategies, MapReduce, Hadoop, performance testing




DOI: https://doi.org/10.26483/ijarcs.v5i8.2355




