BigDataFr recommends: A Big Data Analyzer for Large Trace Logs
‘The current generation of Internet-based services is typically hosted on large data centers that take the form of warehouse-size structures housing tens of thousands of servers. Continued availability of a modern data center is the result of a complex orchestration among many internal and external actors, including computing hardware, multiple layers of intricate software, networking and storage devices, and electrical power and cooling plants. During the course of their operation, many of these components produce large amounts of data in the form of event and error logs that are essential not only for identifying and resolving problems but also for improving data center efficiency and management.
Most of these activities would benefit significantly from data analytics techniques to exploit hidden statistical patterns and correlations that may be present in the data. The sheer volume of data to be analyzed makes uncovering these correlations and patterns a challenging task. This paper presents BiDAl, a prototype Java tool for log-data analysis that incorporates several Big Data technologies in order to simplify the task of extracting information from data traces produced by large clusters and server farms.’ […]
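To give a flavour of the kind of aggregation such a log analyzer performs, here is a minimal, self-contained Java sketch that counts error events per machine in a plain-text trace and prints the machines with the most errors. The input format (CSV lines of the form "timestamp,machineId,eventType") and the file name are hypothetical illustrations, not the actual trace format or API used by BiDAl.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;

// Illustrative sketch only: aggregate error counts per machine from a
// hypothetical CSV trace ("timestamp,machineId,eventType"); this is not
// BiDAl's implementation.
public class ErrorCountByMachine {

    public static void main(String[] args) throws IOException {
        String traceFile = args.length > 0 ? args[0] : "trace.csv"; // hypothetical input file
        Map<String, Long> errorsPerMachine = new HashMap<>();

        try (Stream<String> lines = Files.lines(Paths.get(traceFile))) {
            lines.map(line -> line.split(","))
                 .filter(fields -> fields.length >= 3 && "ERROR".equals(fields[2].trim()))
                 .forEach(fields -> errorsPerMachine.merge(fields[1].trim(), 1L, Long::sum));
        }

        // Print machines sorted by descending error count to surface outliers.
        errorsPerMachine.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .forEach(e -> System.out.println(e.getKey() + "\t" + e.getValue()));
    }
}

A real analysis over cluster-scale traces would of course push this kind of aggregation into a Big Data backend rather than a single-machine stream, which is precisely the gap the paper's tool aims to bridge.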
Read paper
By Alkida Balliu & Dennis Olivetti (Gran Sasso Science Institute (GSSI)), L’Aquila, Italy), Ozalp Babaoglu, Moreno Marzolla & Alina Sîrbu (Department of Computer Science and Engineering, University of Bologna, Italy)
Source: arxiv.org