The Climate Adaptation Summit 2021 took place on 25 January 2021 to accelerate the implementation of the 2015 Paris Agreement on Climate Change. The opening sessions stressed the urgency of that acceleration, with a recurring theme that the economy rests on the environment and on education (people). Time is limited to catch up and to keep the world below the 2 degrees Celsius target, a pledge that was made five years ago.
Big Data is growing so rapidly that it is outpacing the concepts used to interpret it. As these large data sets grow to unprecedented size, hence the name “Big Data”, mathematical conceptual and theoretical frameworks will need to be developed and elaborated to catch up and turn them into actionable insights.
“The trouble is, we don’t have a unified, conceptual framework for addressing questions of data complexity…Big data without a “big theory” to go with it loses much of its usefulness, potentially generating new unintended consequences.” Geoffrey West (2013)
Edge and fog analytics are on the rise as a result of an unprecedented increase in data capture and data flow from Internet of Things (IoT) devices. These devices are spreading rapidly alongside a new range of cloud-to-edge machine learning and artificial intelligence (ML/AI) chips, built in to provide high-performance computing (HPC) for processing streaming data. ML and AI at the edge include deep learning (DL) and reinforcement learning (RL) to process data flows and carry out analytics in real time. …
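Processing a data stream in real time at the edge typically means updating statistics incrementally as each reading arrives, rather than storing the full stream. The following is a minimal illustrative sketch of that idea (the `StreamingDetector` class, the threshold, and the sample readings are all hypothetical, not from the text): an online anomaly detector that maintains a running mean and variance with Welford's algorithm and flags readings far from the mean.

```python
import math


class StreamingDetector:
    """Hypothetical edge-side detector: flags readings more than
    z_threshold standard deviations from the running mean."""

    def __init__(self, z_threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford)
        self.z_threshold = z_threshold

    def update(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            is_anomaly = std > 0 and abs(x - self.mean) / std > self.z_threshold
        else:
            is_anomaly = False  # not enough history yet
        # Welford's online update of mean and variance
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return is_anomaly


detector = StreamingDetector()
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 35.0]  # last value is a spike
flags = [detector.update(r) for r in readings]
print(flags)  # only the final spike is flagged
```

Because each update touches only three scalars, the detector runs in constant memory, which is what makes this style of analytics feasible on resource-constrained edge devices.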
Big Data, as its name implies, involves enormous data sets. The term “Big Data” first appeared in academic publications in the 1990s. By 2008 it was widely used, and it has continued to spread with the growth of cloud infrastructure, machine learning and artificial intelligence.
Today there is an even greater “rush to compute unlike anything we’ve ever seen before”, wrote Matt Day. Big Data is increasingly sought after to detect hidden patterns, since the presence of patterns in data indicates the possibility of prediction and discovery. …
“Historical perspective differs from history in that the object of historical perspective is to sharpen one’s vision of the present, not the past”. Barbara S. Lawrence
Studying ML from a historical perspective can help sharpen its present for the future. According to Barbara Lawrence: “Historical perspective expands research horizons by encouraging study of the relative stability of phenomena, providing alternative explanations for phenomena, and aiding problem formulation and research design.”
Big Data has grown rapidly in recent years to an unprecedented position, attracting more attention and being used in many more ways than tabular or structured data ever was. However, it is also creating unprecedented new challenges, including epistemic ones.
Research on Big Data has identified challenges that are not only technological but also epistemological, related for instance to establishing theoretical and conceptual frameworks to scale inferences and machine learning algorithms. The sampling paradigm has also changed under Big Data, as more data does not inherently remove sampling bias; the volume may…
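The point that more data does not inherently remove sampling bias can be illustrated with a short simulation (a hypothetical sketch, not from the text): when the sampling scheme itself is biased, the estimate converges to the wrong value no matter how large the sample grows.

```python
import random

random.seed(0)

# Synthetic population: readings with a true mean near 50.
population = [random.gauss(50, 10) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# Biased sampling scheme (illustrative): only values above 45 are ever
# observed, e.g. a sensor that fails to report low readings.
observable = [x for x in population if x > 45]

# Growing the sample does not fix the bias; estimates stay well above
# the true mean because the scheme, not the sample size, is the problem.
for n in (100, 10_000, len(observable)):
    estimate = sum(observable[:n]) / n
    print(f"n={n:>6}: estimate={estimate:.1f} (true mean={true_mean:.1f})")
```

This is one reason the classical sampling paradigm needs rethinking under Big Data: volume is not a substitute for a well-designed, representative sampling frame.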
Abstraction and reasoning are elusive notions, but they may play a crucial role in artificial intelligence’s decision-making process. By analogy with the human brain, artificial intelligence (AI) makes decisions by “learning” from features discovered in data (text, images, sound, sensor readings, video, etc.) via deep learning, which involves large artificial neural networks (ANNs) with multiple layers of connected “artificial” neurons.
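The “layers of connected artificial neurons” idea can be sketched in a few lines of plain Python (an illustrative toy, not a real deep network): each layer computes weighted sums of the previous layer's outputs, and a nonlinearity such as ReLU lets stacked layers represent more abstract features. The weights here are fixed made-up numbers; in practice they are learned from data via backpropagation.

```python
def relu(values):
    """Nonlinear activation: pass positives through, zero out negatives."""
    return [max(0.0, v) for v in values]


def dense(inputs, weights, biases):
    """One fully connected layer: each output neuron is a weighted sum
    of all inputs plus a bias (one weight row per output neuron)."""
    return [
        sum(w * x for w, x in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ]


# Toy two-layer network: 3 inputs -> 2 hidden neurons -> 1 output.
x = [0.5, -1.0, 2.0]
hidden = relu(dense(x, [[0.2, -0.1, 0.4], [0.7, 0.3, -0.2]], [0.0, 0.1]))
y = dense(hidden, [[1.0, -0.5]], [0.2])
print(y)
```

Deep learning scales this same structure to millions of neurons and many layers, which is where the discovered features that drive AI decisions come from.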
OperAI develops Math and AI-based solutions to speed up and streamline operational processes at the edges of the cloud.