One of today's most popular topics is "Artificial Intelligence". Artificial intelligence can be defined as the transfer to computers of the capabilities attributed to human intelligence: reasoning, making sense of things, generalization, and learning from past experience.
As a scientific discipline, artificial intelligence encompasses many approaches and techniques, such as machine learning (deep learning and reinforcement learning are examples), knowledge extraction and inference engines (which cover tasks such as planning, scheduling, knowledge representation, search, and optimization), and robotics (which covers the integration of control, sensing, sensors, actuators, and other techniques into cyber-physical systems).
Machine Learning is the general name for computer algorithms that model a given problem from data obtained from the problem environment. These algorithms are typically examined under the headings of prediction, estimation, and classification.
Machine Learning can be divided into three categories according to whether the values / labels of the outputs corresponding to the input data are known: Supervised Learning, Unsupervised Learning, and Reinforcement Learning.
Supervised Learning: for each input, there is information about what the corresponding output should be. In this approach, the aim is to find a function that maps inputs to outputs. The most common applications are Regression and Classification.
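As a rough illustration of the supervised setting described above, the following sketch fits a line to a small labeled dataset by ordinary least squares (the data and variable names are made up for illustration):

```python
# A minimal supervised-learning sketch: fitting y = a*x + b by least squares
# on a toy labeled dataset. Data and names are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for a single feature: returns slope and intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Labeled training data: each input x is paired with a known output y.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.0, 8.1]   # roughly y = 2x

a, b = fit_line(xs, ys)
print(a, b)  # slope close to 2, intercept close to 0
```

The learned function can then be used to predict outputs (regression) for inputs that were never seen during training.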
Unsupervised Learning: only input data is available, and the aim is to discover hidden relationships within it; what the output should be is not known. The best-known examples are Clustering and Association rule mining.
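The unsupervised setting can be sketched with a tiny k-means clustering example, in which group structure is discovered from unlabeled points (a minimal sketch; the data, initialization, and parameters are illustrative):

```python
# A minimal unsupervised-learning sketch: k-means clustering (k = 2) on
# unlabeled 1-D points. All data and names are illustrative.

def kmeans_1d(points, k=2, iters=10):
    """Cluster 1-D points into k groups by alternating assignment and update."""
    centers = points[:k]  # naive initialization: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Move each center to the mean of its assigned points.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# No labels are given; the two-group structure is discovered from the data.
points = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
print(sorted(kmeans_1d(points)))  # centers near 1.0 and 10.0
```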
Reinforcement Learning (for a system consisting of an agent and an environment): the discovery of optimal behavior as a result of the interaction between the agent and the environment.
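A minimal sketch of this agent-environment loop, assuming a toy 5-state corridor environment and tabular Q-learning (the environment, rewards, and parameters are all illustrative):

```python
# A minimal reinforcement-learning sketch: tabular Q-learning on a toy
# 5-state corridor. The agent starts at state 0 and is rewarded only on
# reaching state 4; everything here is illustrative.

import random

N_STATES = 5
MOVES = [-1, +1]                   # action 0: left, action 1: right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration rate

Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action]

random.seed(0)
for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: usually exploit the best-known action, sometimes explore.
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = 1 if Q[s][1] >= Q[s][0] else 0
        s2 = min(max(s + MOVES[a], 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # Q-learning update: nudge Q[s][a] toward reward + discounted future value.
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

# The learned policy should prefer moving right (action 1) in every state.
policy = [1 if Q[s][1] >= Q[s][0] else 0 for s in range(N_STATES - 1)]
print(policy)
```

Note that no input/output pairs are given here; the agent learns purely from the reward signal produced by its interaction with the environment.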
Data mining is the process of extracting meaningful and useful information from data. This information can take the form of rules, relationships, associations, and mathematical models.
Today, the increase in the amount of data and the rapid growth in data collection and storage capacity have pushed humanity toward new approaches: more data is now produced than people and computers can process. This rapid growth required new techniques for effective data analysis, and data mining emerged to serve description and prediction.
Accordingly, there are two types of data mining models: descriptive models and predictive models.
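A descriptive model of the kind mentioned above (rules and associations) can be sketched with a toy association-rule computation; the items, transactions, and rule are all made up for illustration:

```python
# A small descriptive data-mining sketch: deriving an association rule
# ("customers who buy bread also buy milk") with its support and confidence
# from toy transaction data.

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "tea"},
    {"bread", "milk", "tea"},
]

def support(itemset):
    """Fraction of transactions that contain every item in the set."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Rule: bread -> milk
sup_rule = support({"bread", "milk"})            # how often both occur together
conf_rule = sup_rule / support({"bread"})        # how often milk follows bread
print(sup_rule, conf_rule)
```

Here the rule "bread implies milk" holds in 60% of all transactions (support) and in 75% of the transactions that contain bread (confidence), which is exactly the kind of rule-shaped information data mining extracts.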
Optimization is the process of selecting the best among the possible alternative solutions under given conditions. The cheapest, the lightest, the longest lifetime, the shortest duration, etc. can be chosen as the objective function.
In a typical optimization problem there are decision variables that represent the problem, constraints that bound the values the decision variables may take, constants that do not change, and an objective function built from the decision variables and constants.
With these definitions, we define optimization as the process of finding the global solution: the assignment of values to the decision variables that satisfies the given constraints and attains the best objective function value among all feasible alternatives.
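To make these definitions concrete, here is a toy problem (all coefficients invented) with two decision variables, three constraints, and a linear objective, solved by brute-force enumeration over a small feasible grid:

```python
# A small illustrative optimization problem: decision variables x and y,
# constraints bounding them, constants in the objective. We simply enumerate
# a feasible integer grid and keep the best objective value.

def objective(x, y):
    return 3 * x + 4 * y          # objective function: constants 3 and 4

def feasible(x, y):
    # Constraints limiting how the decision variables may vary.
    return x + 2 * y <= 14 and 3 * x - y >= 0 and x - y <= 2

best = None
for x in range(0, 15):            # decision variable x
    for y in range(0, 15):        # decision variable y
        if feasible(x, y):
            value = objective(x, y)
            if best is None or value > best[0]:
                best = (value, x, y)

print(best)  # best objective value and the (x, y) that attains it
```

Enumeration like this only works for tiny problems; it is shown here purely to label the parts (variables, constraints, objective), not as a practical solution method.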
An algorithm that produces a reasonable, near-optimal solution to a problem, but gives no guarantee that the solution is optimal, is a heuristic algorithm. Heuristic algorithms are designed to deliver near-optimal solutions, and they are used when no exact solution method for the problem is known or when the exact methods are not economical. Exact optimization techniques guarantee the best solution, but for large-scale problems they may take a very long time to reach it; heuristic techniques do not guarantee the best solution, but they offer solutions in a short time. Branch-and-bound and dynamic programming are examples of exact optimization techniques. Heuristic techniques can be classified as artificial intelligence techniques, artificial neural networks, and metaheuristics (genetic algorithms, tabu search, simulated annealing, etc.).
Some algorithms used in heuristic computing: Tabu Search, Ant Colony Optimization, Simulated Annealing, Evolutionary Computation, and Genetic Algorithms.
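As a sketch of one of these metaheuristics, the following simulated annealing example minimizes a toy one-dimensional function; the function, neighborhood, and cooling schedule are all illustrative choices:

```python
# A minimal simulated annealing sketch, minimizing the toy function
# f(x) = (x - 3)^2. The minimum is at x = 3; all parameters are illustrative.

import math
import random

def f(x):
    return (x - 3) ** 2

random.seed(1)
x = 0.0                     # current solution
temp = 1.0                  # initial temperature
for step in range(2000):
    candidate = x + random.uniform(-0.5, 0.5)   # random neighbor of x
    delta = f(candidate) - f(x)
    # Always accept improvements; accept worse moves with a probability that
    # shrinks as the temperature cools, which helps escape local optima.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = candidate
    temp *= 0.995                               # geometric cooling schedule

print(x)  # close to the minimum at x = 3
```

The result is near-optimal rather than provably optimal, which is exactly the trade-off described above: no guarantee, but a good solution found quickly.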
Our Artificial Intelligence Applications
Optimization: Distribution Problems
Regression: Electricity Load Demand Forecasting, Stock Forecasting
Classification: Turkish Natural Language Processing, Image Classification, Pattern Recognition
Data Mining: Classification, Regression, Clustering, Association analysis
Social Media Tracking, Analysis, and Anomaly Detection
CDR Analysis and Relationship / Anomaly Detection
Extraction of Usage Features via CDR Analysis
Point, Group, and Conditional Anomaly Detection in Big Data
Pattern and Anomaly Detection in Streaming Data
Spatio-Temporal Prediction / Detection Systems
Prediction from Event Patterns
Document Summarization, Topic Detection, and Classification
Sentiment Analysis and Polarity Detection