
The term “Big Data” was first introduced into the computing world by Roger Magoulas of O’Reilly Media in 2005, to describe an enormous amount of data that conventional data management techniques cannot handle because of its complexity and size. The evolution of Big Data as a research and scientific subject shows that the term appeared in studies before 1970 but entered various disciplines around 2008. Accordingly, the notion of Big Data is approached from different perspectives and is involved in many areas (Elena, Florina, Anca and Manole, 2012). However, the term “Big Data” is not new, as it has been used in many research papers for quite some time:
• In 1997, it was used in the context of visualizing gigantic data sets (Cox and Ellsworth, 1997).
• In 1998, it was used in a hardware-related presentation (Mashey, 1998), as well as in the context of information extraction (Weiss and Indurkhya, 1998).
• In 2003, it was used in relation to aggregated data (Diebold, 2003).
The year 2001 can be considered a major step in the definition of Big Data (Ossi and Jari, 2016, p.73).
Definitions
• Big Data is a term that describes the massive volumes of data generated through the use of digital tools and information systems. The term Big Data is used when the amount of data a business needs to handle requires new technological approaches in terms of storage, processing and use. These volumes of data are often so large that they exceed the capabilities of conventional computer analysis and management; to refer to them, we speak of “Big Data” (Ossi and Jari, 2016).
There are other definitions of Big Data. First, the MIKE 2.0 project, an open source standard for information management, describes Big Data by its size, as a massive, complex and independent collection of data sets, each with the potential to interact.

• According to Gartner’s IT Glossary, Big Data is defined as “large, fast, and varied information resources that require cost-effective and innovative forms of information processing for enhanced insight and decision-making” (Elena, Florina, Anca and Manole, 2012).
• From Ed Dumbill’s point of view, Big Data can be defined as data that exceeds the processing capacity of conventional database systems (Elena, Florina, Anca and Manole, 2012). Furthermore, according to a study conducted by Elena, Florina, Anca and Manole (2012), IBM defined Big Data using four Vs, namely: Volume, Velocity, Variety and Veracity.
Volume: refers to the amount of data generated by a business. This data must be analyzed in order to acquire a meaningful understanding of it (Elena, Florina, Anca and Manole, 2012).
Velocity: refers to the speed at which Big Data can be processed and managed (Elena, Florina, Anca and Manole, 2012).
Variety: refers to the types of data that Big Data can include. It relates to data acquisition, data representation and interpretation (Elena, Florina, Anca and Manole, 2012).
Veracity: refers to the degree to which a manager trusts the information used to make a decision (Elena, Florina, Anca and Manole, 2012).
A fifth V, identified as Value, was later added to the definition of Big Data: the added value that the data can bring to the business. To conclude, other research conducted by Ossi and Jari (2016) identified several definitions of Big Data from different researchers. The report concluded that most of the definitions obtained during the study related, at least in part, to the 3Vs (volume, variety, veracity). See Appendix 1 for the 5Vs of Big Data.
In another report, conducted by IDC, Big Data describes the continuous increase of data and the technologies needed to gather, oversee, manage and analyze it. From a technology point of view, Big Data incorporates the infrastructure and software that integrates, organizes, manages, analyzes and presents information. Furthermore, IDC defines Big Data as “a set of technologies, tools, processes and procedures that enable an organization to create, manipulate and manage very large amounts of varied data to facilitate rapid decision-making” (Lamour and Slaoui, 2016).