Determining the Effect of Similar Queries in Cache-based Cloud Datastores

Ruchi Nanda, Prof. Swati V. Chande, Prof. Krishna S. Sharma

Abstract


Caching is a technique that reduces the number of database accesses and the query load on the database server, speeds up data retrieval, and lowers query processing time and overall response time. With the growth of data in cloud environments, cloud datastores are expected to manage both static and dynamic data efficiently. Static databases are crucial to maintain, as decisions and analyses are made on the basis of the data available in them. A caching scheme is employed with the aim of enhancing the performance of queries on static cloud databases. Processing time is evaluated over different percentages of cache-hit queries. Experiments performed on HBase show that the time to retrieve the results of cache-miss queries gradually decreases as the cache becomes populated with the results of more cache-miss queries. A reduction is also observed in the time taken by queries executed in succession directly on the non-cache-based system. Therefore, an experiment is conducted to examine the effect of similar cache-hit queries run in succession on the cache-based system. Comparison against the non-cache-based system shows that successive trips to the cache improve the performance of the cache-based datastore over the non-cache-based datastore by 48.64% in terms of cumulative processing time.
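To illustrate the kind of scheme described above, the following is a minimal sketch, not the authors' implementation, of a query-result cache placed in front of the standard HBase Java client. The table name `static_data`, the use of the row key as the cache key, and the in-memory `HashMap` cache are assumptions made for illustration only; the paper's own caching scheme and cache-sizing policy may differ.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

/** Minimal sketch of a query-result cache in front of HBase (hypothetical, for illustration). */
public class CachedQueryExecutor {
    private final Map<String, Result> cache = new HashMap<>(); // query key -> cached result
    private final Table table;

    public CachedQueryExecutor(Connection connection, String tableName) throws IOException {
        this.table = connection.getTable(TableName.valueOf(tableName));
    }

    /** Serves a cache hit from memory; on a miss, queries HBase and populates the cache. */
    public Result execute(String rowKey) throws IOException {
        Result cached = cache.get(rowKey);
        if (cached != null) {
            return cached; // cache hit: no trip to the datastore
        }
        Result fetched = table.get(new Get(Bytes.toBytes(rowKey))); // cache miss: query HBase
        cache.put(rowKey, fetched); // populate the cache for similar future queries
        return fetched;
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            CachedQueryExecutor executor = new CachedQueryExecutor(connection, "static_data"); // table name assumed
            executor.execute("row-001"); // first run: cache miss, fetched from HBase
            executor.execute("row-001"); // similar query in succession: cache hit
        }
    }
}
```

Under this sketch, successive identical queries are answered from the in-memory map rather than HBase, which is the effect the experiments measure when comparing cumulative processing time on the cache-based and non-cache-based systems.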

Keywords: caching, column-based datastores, HBase, cache size, cloud-based systems, cloud datastores, static cloud datastore, processing time, query response time

DOI: https://doi.org/10.26483/ijarcs.v8i3.3124
