Big Data Architectures

As data increase in size, velocity, and variety, new computing technologies become necessary. These new technologies, which include both hardware and software, must be easily expanded as more data are processed. This property is known as scalability.

Even if processing power is expanded by combining several computers in a cluster, creating a distributed system, conventional software for distributed systems usually cannot cope with big data.

One of the limitations is the efficient distribution of data among the different processing and storage nodes. To deal with these requirements, new software tools and techniques have been developed.
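One common technique for distributing data among nodes is hash partitioning: each record is assigned to a node by hashing its key. The sketch below illustrates the idea in a single process; the node count and the sample records are illustrative assumptions, not part of the original text.

```python
def partition(records, n_nodes):
    """Assign each (key, value) record to a node by hashing its key.

    Records with the same key always land on the same node, which is
    what later per-key processing (e.g., aggregation) relies on.
    """
    buckets = {i: [] for i in range(n_nodes)}
    for key, value in records:
        buckets[hash(key) % n_nodes].append((key, value))
    return buckets

# Hypothetical records distributed across 3 nodes.
records = [("alice", 1), ("bob", 2), ("carol", 3), ("alice", 4)]
buckets = partition(records, 3)
```

Because the assignment depends only on the key, any node can later locate a record without consulting the others.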

MapReduce is a programming model in which the data set is divided into parts, called chunks, and each cluster computer stores in its memory the chunk of the data set it needs to accomplish its processing task.
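The model can be sketched with the classic word-count example. This is a minimal, single-process illustration under the assumption of two text chunks; a real framework such as Hadoop would run the map and reduce phases in parallel on the nodes that store each chunk.

```python
from collections import defaultdict

def map_phase(chunk):
    # Map: emit an intermediate (word, 1) pair for every word in this chunk.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Shuffle: group intermediate pairs by key, as the framework does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine the values for each key; here, sum the counts.
    return {word: sum(counts) for word, counts in groups.items()}

# Two hypothetical chunks of a larger data set.
chunks = ["big data big", "data velocity variety"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
counts = reduce_phase(shuffle(pairs))
# counts == {"big": 2, "data": 2, "velocity": 1, "variety": 1}
```

Each map call touches only its own chunk, which is why the model scales: adding nodes adds both storage for chunks and processing capacity for the map phase.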