Read about the assigned robotics terminology and post your findings on this blog.
George: Turing test
Emanu: Computational intelligence
Sohail: Fuzzy logic
William: AI searching techniques
Aarfa: AI pattern recognition techniques
Fatema: AI machine learning techniques
Jamal: AI heuristics techniques
Jamil: Neural networks
Sadhu: Natural language communication and AI
Sheehan: AI pattern recognition techniques
Caleb: Neural networks
Solitei: Fuzzy logic
Heuristics
Heuristic refers to experience-based techniques used in problem solving, learning, and discovery. Where a thorough search is impractical, heuristic methods are used to speed up the process of finding a satisfactory solution. Every search process can be viewed as a traversal of a directed graph, in which the nodes represent problem states and the arcs represent relationships between states. The search process must find a path through this graph, starting at an initial state and ending in one or more final states. In general, heuristic search improves the quality of the paths that are explored.
In computer science, a heuristic is a technique designed to solve a problem that ignores whether the solution can be proven to be correct, but which usually produces a good solution, or solves a simpler problem that contains or intersects with the solution of the more complex problem. Heuristics in computer science are intended to gain computational performance or conceptual simplicity, potentially at the cost of accuracy or precision. In artificial intelligence, a heuristic method can accomplish its task by using search trees. However, instead of generating all possible solution branches, a heuristic is selective at each decision point, picking the branches that are more likely to lead to a solution.
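To make this concrete, here is a minimal Python sketch of greedy best-first search, one common heuristic search technique; the graph and the heuristic estimates below are invented purely for illustration, since a real heuristic would be domain-specific.

import heapq

# Toy graph: each node maps to its neighbours (invented for illustration).
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["G"],
    "E": ["G"],
    "G": [],
}

# Heuristic: a made-up estimate of how close each node is to the goal G.
h = {"A": 4, "B": 3, "C": 2, "D": 1, "E": 1, "G": 0}

def greedy_best_first(start, goal):
    """Always expand the node that the heuristic rates as most promising."""
    frontier = [(h[start], start, [start])]   # priority queue ordered by h
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbour in graph[node]:
            heapq.heappush(frontier, (h[neighbour], neighbour, path + [neighbour]))
    return None

print(greedy_best_first("A", "G"))   # -> ['A', 'C', 'D', 'G']

Because the queue is ordered by the heuristic alone, only the branches rated most promising are expanded, which is exactly the selectivity described above.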
- Jamal
http://en.wikipedia.org/wiki/Heuristic#Human-computer_interaction
artificialintelligence-notes.blogspot.com/.../heuristic-search-technique..
Pattern recognition aims to classify data (patterns) based on either a priori knowledge or on statistical information extracted from the patterns. The patterns to be classified are usually groups of measurements or observations.
A complete pattern recognition system consists of a sensor that gathers the observations to be classified or described; a feature extraction mechanism that computes numeric or symbolic information from the observations; and a classification or description scheme that does the actual job of classifying or describing observations, relying on the extracted features.
The classification or description scheme usually uses one of the following approaches: statistical (or decision theoretic), syntactic (or structural), or neural. Statistical pattern recognition is based on statistical characterizations of patterns, assuming that the patterns are generated by a probabilistic system. Structural pattern recognition is based on the structural interrelationships of features. Neural pattern recognition employs the neural computing paradigm that has emerged with neural networks.
Important application areas are image analysis, character recognition, speech analysis, man and machine diagnostics, person identification and industrial inspection.
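As a rough illustration of the statistical approach, the Python sketch below classifies two-dimensional feature vectors by the class centroid they fall closest to; the feature values and class labels are invented for illustration.

import math

# Invented training observations: (feature vector, class label).
training = [
    ((1.0, 1.2), "small"), ((0.8, 1.0), "small"), ((1.1, 0.9), "small"),
    ((4.0, 4.2), "large"), ((4.3, 3.9), "large"), ((3.8, 4.1), "large"),
]

def centroids(samples):
    """Average the feature vectors of each class (a simple statistical characterisation)."""
    sums, counts = {}, {}
    for (x, y), label in samples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {label: (sx / counts[label], sy / counts[label])
            for label, (sx, sy) in sums.items()}

def classify(features, model):
    """Assign the class whose centroid is nearest to the extracted features."""
    return min(model, key=lambda label: math.dist(features, model[label]))

model = centroids(training)
print(classify((1.0, 1.1), model))   # -> 'small'
print(classify((4.1, 4.0), model))   # -> 'large'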
http://aaai.org/AITopics/PatternRecognition
Neural Networks
Neural networks are an information processing technique based on the way biological nervous systems, such as the brain, process information. The fundamental concept of neural networks is the structure of the information processing system. Composed of a large number of highly interconnected processing elements, or neurons, a neural network system uses the human-like technique of learning by example to resolve problems. Just as in biological systems, learning involves adjustments to the synaptic connections that exist between the neurons. Neural networks can differ in the way their neurons are connected, the way they transmit patterns of activity throughout the network, and the way they learn, including their learning rate. Neural networks are being applied to an increasingly large number of real-world problems. Their primary advantage is that they can solve problems that are too complex for conventional technologies: problems that do not have an algorithmic solution or for which an algorithmic solution is too complex to be defined. In general, neural networks are well suited to problems that people are good at solving but that computers generally are not. These include pattern recognition and forecasting, which requires the recognition of trends in data.
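A tiny, hypothetical Python sketch of this "learning by example" idea: a single artificial neuron adjusts its connection weights until it reproduces the logical AND function. The training data, learning rate and number of passes are chosen purely for illustration.

# Training examples for logical AND: (inputs, expected output).
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]   # one weight per input connection
bias = 0.0
learning_rate = 0.1    # how strongly each error adjusts the connections

def neuron(inputs):
    """A single processing element: weighted sum followed by a step activation."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

# Learning: repeatedly nudge the 'synaptic' weights toward the correct answers.
for _ in range(20):
    for inputs, target in examples:
        error = target - neuron(inputs)
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error

print([neuron(inputs) for inputs, _ in examples])   # -> [0, 0, 0, 1]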
http://www.pcai.com/web/ai_info/neural_nets.html
FUZZY LOGIC
What is Fuzzy Logic?
Fuzzy Logic is basically a multivalued logic that allows intermediate values to be defined between conventional evaluations like yes/no, true/false, black/white, etc. Fuzzy logic gives us a way to deal with such situations. In fuzzy systems, values are indicated by a number (called a truth value) in the range from 0 to 1, where 0.0 represents absolute falseness and 1.0 represents absolute truth.
Fuzzy logic enables notions such as “rather cold” or “pretty hot” to be formulated by a computer system, as opposed to ordinary computer logic, which would simply state whether conditions are hot or cold. Fuzzy logic takes computer logic to a whole new level by measuring the intensity of the condition and acting accordingly if necessary; in this case the condition would be measured by how hot it is on a scale of 0 to 1.
Application Of Fuzzy Logic:
Fuzzy logic controls household appliances such as washing machines, which sense load size and detergent concentration and adjust their wash cycles accordingly.
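A minimal Python sketch of the truth-value idea, assuming an invented temperature range: instead of a hard hot/cold threshold, each temperature receives a degree of membership between 0.0 and 1.0 in the fuzzy set "hot".

def membership_hot(temperature_c):
    """Degree (0.0-1.0) to which a temperature counts as 'hot'.
    Below 20 C it is not hot at all; above 35 C it is fully hot;
    in between the truth value rises linearly (ranges invented for illustration)."""
    if temperature_c <= 20:
        return 0.0
    if temperature_c >= 35:
        return 1.0
    return (temperature_c - 20) / 15.0

for t in (15, 24, 30, 38):
    print(t, "C ->", round(membership_hot(t), 2))
# -> 15 C -> 0.0, 24 C -> 0.27, 30 C -> 0.67, 38 C -> 1.0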
-----------------------
By SOHAIL SHARIFF
http://aaai.org/AITopics/FuzzyLogic
Neural Networks:
An artificial neural network (ANN) is an information processing model inspired by the way biological nervous systems work. Artificial neural networks are based on the way the nervous systems of the body process information, e.g. how the brain processes information. The main and most important elements of a neural network are its structure and the way it processes information.
A neural network is made up of many interconnected processing elements known as neurones. These work together in unison in order to solve specific problems. What makes an ANN so similar to a human is the fact that it works by learning from example. It is configured for a specific application through a learning process: you actually show the ANN how to perform the specific task, just as you would show a human.
Just as in a biological system, learning in an ANN involves adjustments to the synaptic connections that exist between the neurones. This is how both biological systems and ANNs learn how to do things.
Neural networks have the ability to derive meaning from complicated or imprecise data. Due to this ability, they are also able to extract patterns or detect trends that are too complex to be noticed by humans or other computing techniques.
Once a neural network has been trained on a particular task, it can be seen as an expert in the category of information it has been taught to analyse.
The advantages of Artificial Neural Networks include:
1. Adaptive learning: The ability to learn how to do tasks based on the data given for training or initial experience.
2. Self-Organisation: An ANN can create its own organisation or representation of the information it receives during learning time.
3. Real Time Operation: ANN computations may be carried out in parallel, and special hardware devices are being designed and manufactured which take advantage of this capability.
4. Fault Tolerance via Redundant Information Coding: Partial destruction of a network leads to the corresponding degradation of performance. However, some network capabilities may be retained even with major network damage. (Stergiou and Siganos, 2011)
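As a loose illustration of point 4, the Python sketch below builds a toy network of ten redundant hidden units (with invented weights and bias) and shows that "destroying" some of them still leaves the pattern correctly recognised.

import random

random.seed(0)

# A toy 'network': ten redundant hidden units that all encode the same simple rule
# (fire when both inputs are high); the weights and bias values are invented.
hidden_units = [{"w": [0.5 + random.uniform(-0.05, 0.05),
                       0.5 + random.uniform(-0.05, 0.05)],
                 "bias": -0.7}
                for _ in range(10)]

def network_output(inputs, units):
    """Average the firing of whichever hidden units are still intact."""
    def fires(unit):
        total = sum(w * x for w, x in zip(unit["w"], inputs)) + unit["bias"]
        return 1 if total > 0 else 0
    return sum(fires(u) for u in units) / len(units)

pattern = (1, 1)                              # a pattern the network should recognise
print(network_output(pattern, hidden_units))  # -> 1.0 with all ten units intact
damaged = hidden_units[:6]                    # 'destroy' four of the ten units
print(network_output(pattern, damaged))       # -> still 1.0: capability retained despite damage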
Bibliography:
Stergiou, C., & Siganos, D. (n.d.). Neural Networks. Computing. Retrieved February 21, 2012, from http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#What%20is%20a%20Neura
Artificial intelligence
Artificial Intelligence, otherwise known as AI, is the study and development of intelligent machines capable of performing complex tasks that require thought and behavior normally associated with human intelligence.
Search in Artificial Intelligence
Search plays a major role in solving many Artificial Intelligence (AI) problems; it is a universal problem-solving mechanism in AI. In many problems, the sequence of steps required to reach a solution is not known in advance but must be determined by systematic trial-and-error exploration of alternatives. The problems addressed by AI search algorithms fall into three general classes:
single-agent path-finding problems, two-player games, and constraint-satisfaction problems.
Single-agent path-finding problems
Classic examples in the AI literature of path-finding problems are sliding-tile puzzles, Rubik’s Cube and theorem proving. The sliding-tile puzzles are common test beds for research in AI search algorithms as they are very simple to represent and manipulate. Real-world problems include the traveling salesman problem, vehicle navigation, and the wiring of VLSI circuits. In each case, the task is to find a sequence of operations that map an initial state to a goal state.
Two-player games
These are two-player, perfect-information games; chess, checkers, and Othello are examples.
Constraint Satisfaction Problems
The eight queens problem is a classic example. The task is to place eight queens on an 8×8 chessboard such that no two queens are on the same row, column or diagonal. Real-world examples of constraint satisfaction problems are planning and scheduling applications.
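A compact Python sketch of the eight queens problem described above, solved by the kind of systematic trial-and-error exploration mentioned earlier: queens are placed one row at a time, and any placement that violates a constraint is undone (backtracking).

def conflicts(queens, row, col):
    """True if a queen at (row, col) shares a column or diagonal with an earlier queen."""
    return any(c == col or abs(c - col) == abs(r - row)
               for r, c in enumerate(queens))

def solve(n=8, queens=()):
    """Place one queen per row; backtrack whenever a constraint is violated."""
    row = len(queens)
    if row == n:                       # all rows filled: a full solution
        return list(queens)
    for col in range(n):
        if not conflicts(queens, row, col):
            solution = solve(n, queens + (col,))
            if solution:
                return solution
    return None                        # no legal column in this row: backtrack

print(solve())   # one solution as a list of column positions, e.g. [0, 4, 7, 5, 2, 6, 1, 3]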
Computational Intelligence
Computational intelligence (CI) is a set of nature-inspired computational methodologies and approaches for addressing complex real-world problems.
CI is applied when the traditional methodologies and approaches that have primarily been in use, e.g. first principles, probabilistic, black-box, are ineffective. Being more powerful than AI, CI generally focuses on problems that only humans and animals (intelligent species) can solve, as this is the main difference between them.
Unlike artificial intelligence, computational intelligence is used to explain the occurrence of events via reasoning, judgements, and analysis of existing data (KDD), rather than to make decisions or perform tasks.
It primarily includes fuzzy logic systems, neural networks and evolutionary computation. In addition, CI also embraces techniques that stem from the above three or gravitate around one or more of them, such as swarm intelligence and artificial immune systems, which can be seen as part of evolutionary computation, and Dempster-Shafer theory, chaos theory and multi-valued logic, which can be seen as offspring of fuzzy logic systems.
Turing Test
The Turing test is a method proposed by Alan Turing to determine whether a computer is capable of exhibiting intelligent behaviour. As proposed by the British mathematician, a remote human interrogator has, within a fixed period of time, to discriminate between a computer and a human based on their responses to questions posed by the interrogator. By analysing their responses, the human interrogator attempts to determine which correspondent is a human and which is a computer. Generally, the computer is programmed to emulate humans by making occasional mistakes and pausing slightly before giving its responses, much as a human might do in some instances; the interrogator would not expect the machine to make such silly mistakes. If it proves impossible for the interrogator to differentiate between the human and the computer, the computer is deemed to have passed the test.
References
http://www.answers.com/topic/turing-test#ixzz1mzYUrlsn
http://plato.stanford.edu/entries/turing-test/
http://psych.utoronto.ca/users/reingold/courses/ai/turing.html
http://ezinearticles.com/?Artificial-Intelligence-and-the-Turing-Test&id=6631632
AI Machine Learning Techniques
Machine Learning is creating a new way of thinking about our world and about us.
Supervised learning is fairly common in classification problems because the goal is often to get the computer to learn a classification system that we have created, for example digit recognition. More generally, classification learning is appropriate for any problem where deducing a classification is useful and the classification is easy to determine. Supervised learning is the most common technique for training neural networks and decision trees. Both of these techniques are highly dependent on the information given by the pre-determined classifications.
Unsupervised learning seems much harder: the goal is to have the computer learn how to do something that we don't tell it how to do! There are actually two approaches to unsupervised learning. The first approach is to teach the agent not by giving explicit categorizations, but by using some sort of reward system to indicate success.
Often, a form of reinforcement learning can be used for unsupervised learning, where the agent bases its actions on the previous rewards and punishments without necessarily even learning any information about the exact ways that its actions affect the world.
A second type of unsupervised learning is called clustering. In this type of learning, the goal is not to maximize a utility function, but simply to find similarities in the training data. The assumption is often that the clusters discovered will match reasonably well with an intuitive classification. For instance, clustering individuals based on demographics might result in a clustering of the wealthy in one group and the poor in another.
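As a rough sketch of the clustering idea, the toy k-means loop below groups invented one-dimensional "income" values into two clusters without being told any labels; the data and the two starting centres are made up for illustration.

# Invented, unlabelled one-dimensional data (e.g. incomes in thousands).
data = [12, 15, 14, 11, 90, 95, 88, 92]

centres = [float(data[0]), float(data[-1])]   # two arbitrary starting centres

for _ in range(10):                           # a few refinement passes suffice here
    clusters = [[], []]
    for x in data:
        nearest = min((0, 1), key=lambda i: abs(x - centres[i]))
        clusters[nearest].append(x)           # assign each point to its nearest centre
    # recompute each centre as the mean of its cluster
    # (with this data neither cluster ever becomes empty)
    centres = [sum(c) / len(c) for c in clusters]

print(clusters)   # -> [[12, 15, 14, 11], [90, 95, 88, 92]]
print(centres)    # -> [13.0, 91.25]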
http://www.aihorizon.com/essays/generalai/supervised_unsupervised_machine_learning.htm
The two major divisions of pattern recognition are machine vision and sound processing.
A computer "sees" a two-dimensional grid of pixels(short for "picture element"; one of thousands of points on a computer screen from which digital images are formed) with varying colors and degrees of brightness based on numerical values.Image formation is the most technically developed stage of machine vision. A camera records the amount of light reflected into it from the surfaces of objects in a three-dimensional scene. The information is then transmitted through a converter that changes the analog signals into digital information that the computer can interpret.
The main focus in AI when it comes to sound-processing is to make a computer that can recognize what a person says to it. The reason why this is done as opposed to making a computer recognize the sound of a car or the sound of a telephone ring is because 1) there usually is something meaningful when someone talks and 2) making a computer capable of automated speech recognition(ASR) would be a next step in man-machine interface(MMI).As a branch of AI, the importance of ASR and speech synthesis lies in the development of pattern-recognition programs that understands the bits of data that compose the message. Some of the early technologies from this field have found their way into the applications market, but they still need to be refined in order for a computer to communicate intelligently and naturally like people.
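A loose Python sketch of the analog-to-digital step described above: a continuous brightness reading is quantised into one of 256 integer levels, giving the numeric pixel values a program can work with. The voltage range and sample readings are invented.

# Hypothetical analog brightness readings from a sensor, in volts (0.0 - 1.0 V).
analog_samples = [0.02, 0.35, 0.80, 0.99]

def to_pixel(voltage, levels=256, v_max=1.0):
    """Quantise an analog reading into an integer brightness value (0-255)."""
    voltage = min(max(voltage, 0.0), v_max)          # clamp to the sensor's range
    return min(int(voltage / v_max * levels), levels - 1)

digital_row = [to_pixel(v) for v in analog_samples]
print(digital_row)   # -> [5, 89, 204, 253]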
Fuzzy logic deals with reasoning that is approximate rather than fixed and exact. Fuzzy logic is built up of many facets (aspects). These facets include:
1. The fuzzy-set-theoretic facet, FLs
2. The logical facet, FLl
3. The epistemic facet, FLe
4. The relational facet, FLr
An example of fuzzy logic for a simple temperature regulator that uses a fan might look like this:
-If temperature IS very cold THEN stop fan
-If temperature IS cold THEN turn down fan
-If temperature IS normal THEN maintain level
-If temperature IS hot THEN speed up fan
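Three of these rules can be sketched as a tiny fuzzy controller in Python: each temperature receives a degree of membership in "cold", "normal" and "hot", and the fan-speed change is the membership-weighted average of the rule actions. The membership ranges and adjustment values are invented for illustration.

def tri(x, left, peak, right):
    """Triangular membership: 0 outside [left, right], rising to 1 at the peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def fan_adjustment(temp_c):
    """Blend the rule actions according to how true each condition is."""
    memberships = {
        "cold":   tri(temp_c, -10, 5, 18),   # 'IF temperature IS cold THEN turn down fan'
        "normal": tri(temp_c, 15, 21, 27),   # 'IF temperature IS normal THEN maintain level'
        "hot":    tri(temp_c, 24, 35, 50),   # 'IF temperature IS hot THEN speed up fan'
    }
    actions = {"cold": -1.0, "normal": 0.0, "hot": +1.0}   # change in fan speed per rule
    total = sum(memberships.values())
    if total == 0:
        return 0.0
    return sum(memberships[k] * actions[k] for k in memberships) / total

for t in (10, 21, 26, 33):
    print(t, "C ->", round(fan_adjustment(t), 2))
# e.g. 10 C -> -1.0 (turn down), 21 C -> 0.0 (maintain), 26 C -> 0.52 (speed up a little)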
Neuro-fuzzy is a field of artificial intelligence which was proposed by J. S. R. Jang to help machines attain fuzzy knowledge. It is a combination of artificial neural networks and fuzzy logic.
References:
http://www.scholarpedia.org/article/Fuzzy_Logic
http://www.learnartificialneuralnetworks.com/fuzzy-logic.html
A computer "sees" is a two-dimensional grid of pixels(short for "picture element"; one of thousands of points on a computer screen from which digital images are formed) with varying colors and degrees of brightness based on numerical values.Image formation is the most technically developed stage of machine vision. A camera records the amount of light reflected into it from the surfaces of objects in a three-dimensional scene. The information is then transmitted through a converter that changes the analog signals into digital information that the computer can interpret.
The main focus in AI when it comes to sound-processing is to make a computer that can recognize what a person says to it. The reason why this is done as opposed to making a computer recognize the sound of a car or the sound of a telephone ring is because 1) there usually is something meaningful when someone talks and 2) making a computer capable of automated speech recognition(ASR) would be a next step in man-machine interface(MMI).
An aspect of sound-processing research that has made faster progress than ASR is speech synthesis. Taking the knowledge from ASR such as phonemes and such, speech synthesizers have become relatively successful in generating understandable words and sentences.As a branch of AI, the importance of ASR and speech synthesis lies in the development of pattern-recognition programs that understands the bits of data that compose the message.
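A very rough, hypothetical Python sketch of the first step a speech synthesizer takes: looking words up in a pronunciation dictionary to turn text into a phoneme sequence. The two-word dictionary below is invented; a real system would use a full pronunciation lexicon plus letter-to-sound rules for unknown words.

# A toy pronunciation dictionary (invented; real lexicons hold tens of thousands of entries).
phoneme_dict = {
    "hello": ["HH", "AH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
}

def text_to_phonemes(text):
    """Map each known word to its phoneme sequence; mark unknown words for fallback rules."""
    phonemes = []
    for word in text.lower().split():
        phonemes.extend(phoneme_dict.get(word, ["<UNK:%s>" % word]))
    return phonemes

print(text_to_phonemes("Hello world"))
# -> ['HH', 'AH', 'L', 'OW', 'W', 'ER', 'L', 'D']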