Machine Learning: An Artificial Intelligence Approach (Symbolic Computation)
The idea of expressing mathematical routines constructively is widely applicable. Even the simplest routines, ones often thought of as "atomic," such as sin x, can be constructed from their primitive constituent mathematical operations: in this case, the periodicity and symmetry of the sine function plus a truncated Taylor expansion. The idea is that instead of writing a subroutine that computes the value of a function, one writes code that constructs the subroutine that computes the value.
Such a formulation separates the ideas into several independent pieces that can be used interchangeably to facilitate attacking new problems. The advantages are obvious. First, clever ideas need to be coded once in a context independent of the particular application, thus enhancing the reliability of the software. Second, the code is closer to the mathematical basis of the function and is expressed in terms of the vocabulary of numerical analysis.
Third, the code is adaptable to various usages and precisions because the routine's accuracy is an integral part of the code rather than a comment that the programmer adds; just changing the number that specifies the accuracy will generate the single, double, and quadruple precision versions of a subroutine.
Writing subroutines in this style requires the support of a programming language that provides higher-order procedures, streams, and the other powerful abstraction mechanisms available in functional languages. Run-time efficiency does not necessarily suffer: the extra work of manipulating the function's construction need be done only once, and the actual function calls are not encumbered. Moreover, because functional programs have no side effects, they have no required order of execution, which makes them exceptionally easy to execute in parallel.
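The constructive style described above can be sketched in Python, though the report envisions a functional language. This is a minimal illustration, not the report's implementation: the constructor name `make_sine` and the tolerance values are assumptions, and the range reduction and series truncation stand in for the "periodicity and symmetry plus truncated Taylor expansion" ingredients.

```python
from math import pi

def make_sine(tolerance):
    """Construct a sine routine from its mathematical ingredients:
    range reduction (periodicity and symmetry) plus a Taylor series
    truncated according to the requested tolerance."""
    def reduce_range(x):
        # Periodicity: fold x into [-pi, pi].
        x = x % (2 * pi)
        if x > pi:
            x -= 2 * pi
        return x

    def taylor_sine(x):
        # Sum x - x^3/3! + x^5/5! - ... until terms fall below tolerance.
        term, total, n = x, x, 1
        while abs(term) > tolerance:
            term *= -x * x / ((2 * n) * (2 * n + 1))
            total += term
            n += 1
        return total

    def sine(x):
        return taylor_sine(reduce_range(x))

    return sine

# Changing one number yields versions of differing precision, as the
# text suggests: the accuracy is part of the construction, not a comment.
sin_single = make_sine(1e-7)
sin_double = make_sine(1e-15)
```

Each call to `make_sine` pays the construction cost once; the returned `sine` routine is an ordinary function thereafter.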
The current generation of computational mechanics software is based on programming concepts and languages that are two or three decades old. As attention turns to the development of the next generation of software, it is important that the new tools, concepts, and languages that have emerged in the interim be properly evaluated and that the software be built using the best appropriate tools.
Symbolic Reasoning (Symbolic AI) and Machine Learning | Skymind
Designs for finite element systems based on object-oriented concepts have begun to emerge in the literature. As these designs show, object-oriented programming, an offshoot of AI research, can have a major impact on computational mechanics software development by raising the level of abstraction present in large-scale scientific programs. Programs designed in this manner allow developers to reason about program fragments in terms of abstract behavior instead of implementation.
These program fragments are referred to as objects or data abstractions, their abstract quality being derived from precise specifications of their behavior that are separate and independent of implementation and internal representation details. While the bulk of today's computational mechanics production work is done by means of large comprehensive programs, there is a great deal of exploratory work requiring the development of "one-shot" ad hoc custom-built programs.
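The separation of behavioral specification from internal representation can be made concrete with a small sketch. The class names (`StiffnessProvider`, `BarElement`, `SpringElement`) and the single-degree-of-freedom example are hypothetical illustrations, not drawn from any design in the literature:

```python
from abc import ABC, abstractmethod

class StiffnessProvider(ABC):
    """Abstract specification of behavior: client code depends only on
    this interface, never on the representation behind it."""
    @abstractmethod
    def stiffness(self, length: float) -> float:
        ...

class BarElement(StiffnessProvider):
    """One internal representation: axial bar, k = E*A/L."""
    def __init__(self, E: float, A: float):
        self.E, self.A = E, A
    def stiffness(self, length: float) -> float:
        return self.E * self.A / length

class SpringElement(StiffnessProvider):
    """A different representation satisfying the same specification."""
    def __init__(self, k: float):
        self.k = k
    def stiffness(self, length: float) -> float:
        return self.k  # independent of length

def assemble(elements, length):
    # Client reasons in terms of abstract behavior: any StiffnessProvider
    # works here, regardless of its internal representation.
    return sum(e.stiffness(length) for e in elements)
```

Because `assemble` is written against the abstract specification, new element representations can be added without touching it, which is the reuse property the text describes.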
Developers of such ad hoc programs may have access to subroutine libraries for common modules or "building blocks" but not much else. These developers frequently have to reimplement major segments of complete programs in order to "exercise" the few custom components of their intended program.
One potential application of AI methodology is an expert system to assist in synthesizing, on the fly, computational programs tailored to particular problems. The system would require some specification of the program goal and the constraints. The system's knowledge base would contain descriptions of program components together with their attributes. The expert system would have to use both backward- and forward-chaining components: the former to break the goal down into the program structure, and the latter to select program components to "drive" the low-level custom components.
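The two chaining styles mentioned above can be shown in a toy rule engine. The rule contents (facts such as `"nonlinear"` and conclusions such as `"newton_iteration"`) are invented for illustration; a real program-synthesis system would carry far richer component descriptions:

```python
# Each rule pairs a set of premises with a single conclusion.
RULES = [
    ({"linear", "static"}, "direct_solver"),
    ({"nonlinear"}, "newton_iteration"),
    ({"newton_iteration"}, "needs_jacobian"),
]

def forward_chain(facts):
    """Data-driven: derive every conclusion supported by current facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def backward_chain(goal, facts):
    """Goal-driven: break a goal into subgoals until known facts remain."""
    if goal in facts:
        return True
    for premises, conclusion in RULES:
        if conclusion == goal and all(backward_chain(p, facts) for p in premises):
            return True
    return False
```

Backward chaining would decompose the program goal into a structure; forward chaining would then fire on the available component descriptions to fill that structure in.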
Combining numerical techniques with ideas from symbolic computation and with methods incorporating knowledge of the underlying physical phenomena can lead to a new category of intelligent computational tools for use in analysis. Systems that have knowledge of the numerical processes embedded within them and that can reason about the application of these processes can control the invocation and evolution of numerical solutions.
They can "see what not to compute" (Abelson) and take advantage of known characteristics of the problem and the structure of the solution to suggest data representations and appropriate solution algorithms. The coupling of symbolic, knowledge-based computing with numerical computing is particularly appropriate in situations where pure numerical approaches do not provide the capabilities needed for a particular application. For example, numerical function minimization methods can be coupled with constraint-based reasoning methods from AI technology to successfully attack large nonlinear problem spaces where numerical optimization methods alone are too weak to find global minima.
To derive a solution to the problem, domain-specific knowledge about problem solving in terms of symbolic constraints guides the application of techniques such as problem decomposition, constraint propagation, relaxation, and refinement. As higher level modeling tools are built and larger modeling knowledge bases are constructed, issues such as integration, coordination, cooperative development, and customization become critical. A framework for a general finite element modeling assistant has also been proposed recently in the literature.
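A minimal sketch of this coupling, under invented assumptions: symbolic constraints of the form "x must not exceed y" are propagated to narrow interval bounds, and only then is a plain numerical search run on the reduced interval. The function names and the grid-search minimizer are illustrative stand-ins for the far stronger techniques the text refers to:

```python
def propagate(bounds, constraints):
    """Shrink interval bounds by repeatedly applying symbolic
    constraints of the form x <= y until a fixed point is reached."""
    changed = True
    while changed:
        changed = False
        for x, y in constraints:  # constraint: x <= y
            lo_x, hi_x = bounds[x]
            lo_y, hi_y = bounds[y]
            if hi_x > hi_y:            # x cannot exceed y's upper bound
                bounds[x] = (lo_x, hi_y)
                changed = True
            if lo_y < lo_x:            # y cannot fall below x's lower bound
                bounds[y] = (lo_x, hi_y)
                changed = True
    return bounds

def minimize_1d(f, lo, hi, steps=1000):
    """Plain numerical grid search on the (already narrowed) interval."""
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(xs, key=f)
```

The symbolic phase shrinks the space the numerical phase must explore, which is the division of labor the paragraph describes.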
The key feature of the framework is that the system consists of a set of core knowledge sources, covering the various aspects of modeling and model interpretation, that use stored resources for the problem-dependent aspects of the task. In this fashion, accommodating new problem types, as well as individual organizations' approaches to modeling, involves only expansion of the resources without affecting the knowledge sources.
In the comprehensive framework envisaged, the core knowledge sources would perform the functions of model generation and interpretation and of program selection, with possible customization, synthesis, and invocation. The three major resources used by these knowledge sources are as follows. The first is an extended taxonomy, or semantic network, of the various classes of physical systems amenable to finite element modeling and of the assumptions appropriate for each class.
Their purpose is to provide pattern-matching capabilities to the knowledge sources so that the definition of the problem class and key problem parameters can be used by the knowledge sources in their tasks at each level of abstraction. The major design objective in developing these taxonomies will be to avoid exhaustive enumeration of the individual problems to be encountered and instead to build a multilevel classification of problem types based on their functionality, applicable assumptions, behavior, failure modes, analysis strategies, and spatial decompositions.
It is also expected that a large part of knowledge acquisition can be isolated to modifying these taxonomies, either by specialization (customization to an individual organization) or by generalization (merging or pooling the knowledge of separate organizations). The second resource, constructed in a similar manner, is a set of taxonomies representing the capabilities, advantages, and limitations of analysis programs.
The taxonomy must be rich enough that the knowledge source invoking the programs can recommend the appropriate program(s) to use based on the high-level abstractions generated by the other knowledge sources or, if a particular program is not available in the integrated system, can recommend alternate modeling strategies so that the available program(s) can be used effectively and efficiently. As with the previous taxonomy, the program capability taxonomy needs to be designed so that knowledge acquisition about additional programs can be largely isolated to expansion of the taxonomy database.
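A toy version of such a program-capability taxonomy, with invented program names and feature labels, shows the two behaviors described: recommend an exact match when one exists, otherwise fall back to the best partial match as a basis for an alternate modeling strategy:

```python
# Hypothetical taxonomy: each program lists the problem features it handles.
PROGRAM_TAXONOMY = {
    "linear_static_fem": {"linear", "static"},
    "nonlinear_fem": {"nonlinear", "static"},
    "modal_solver": {"linear", "dynamic"},
}

def recommend(problem_features):
    """Match problem features against program capabilities; when no
    program fits exactly, suggest the closest one to adapt toward."""
    exact = [name for name, caps in PROGRAM_TAXONOMY.items()
             if problem_features <= caps]
    if exact:
        return ("use", exact)
    # Alternate strategy: rank programs by how many features they cover.
    best = max(PROGRAM_TAXONOMY,
               key=lambda n: len(problem_features & PROGRAM_TAXONOMY[n]))
    return ("adapt", [best])
```

Adding knowledge about a new program means adding one dictionary entry, which illustrates how knowledge acquisition stays isolated to the taxonomy database.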
The third resource is the programs themselves, including translators to and from neutral files as needed; they are isolated in the design to serve only as resources to solve the model. The issues in this interconnection are largely ones of implementation in coupling numerical and knowledge-based programs, and modern computing environments make such coupling relatively seamless. This section briefly discusses two important problems that must be addressed before reliable modeling environments, such as the one discussed above, can be built: the first is the need to provide more flexibility to knowledge-based systems, and the second is the need to compile a core of modeling assumptions.
The present generation of knowledge-based systems has been justly criticized on three grounds. First, present knowledge-based systems are brittle, in the sense used in computer science as a contrast to "rugged" systems: they work in a very limited domain and fail to recognize, much less solve, problems falling outside their knowledge base. In other words, these systems do not have an explicit representation of the boundaries of their expertise, so there is no way for them to recognize a problem for which their knowledge base is insufficient or inappropriate.
Rather than exhibiting "common sense reasoning" or "graceful degradation," the systems will blindly attempt to "solve" the problem with their current knowledge, producing predictably erroneous results. Current research on reasoning from first principles will help overcome this problem. Combining first principles with specialized rules will allow a system to fall back on sound reasoning when few or no specialized items in its knowledge base cover a situation.
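An explicit representation of the boundary of expertise can be surprisingly small. In this hypothetical sketch (class name and feature labels invented), the knowledge base carries a description of the features its rules are valid for and declines, rather than guesses, outside that domain:

```python
class BoundedKB:
    """Knowledge base that carries an explicit description of its own
    domain and recognizes problems outside it instead of blindly
    'solving' them."""
    def __init__(self, domain, rules):
        self.domain = domain    # features the rules are known to cover
        self.rules = rules      # list of (premise-set, recommendation)

    def solve(self, features):
        unknown = features - self.domain
        if unknown:
            # Graceful degradation: report the boundary violation.
            return ("outside_expertise", unknown)
        for premises, answer in self.rules:
            if premises <= features:
                return ("answer", answer)
        return ("no_rule_applies", None)
```

A fuller system would, as the text suggests, hand the `outside_expertise` and `no_rule_applies` cases to a first-principles reasoner rather than simply reporting them.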
First principles can also be used to check the plausibility of conclusions reached by using specialized knowledge. Second, a KBES developed using the present methodology is idiosyncratic in the sense that its knowledge base represents the expertise of a single human domain expert or, at best, that of a small group of domain experts.
The system thus reproduces only the heuristics, assumptions, and even the style of the expert(s) from whom its knowledge was acquired. The nature of expertise and heuristics is such that another, equally competent expert in the domain may have different, or even conflicting, expertise. It is worth pointing out, however, that a KBES is useful to an organization only if it reliably reproduces the expertise of that organization.
At present, there appear to be no usable formal methods for resolving the idiosyncratic nature of KBESs. There are some techniques for checking the consistency of knowledge bases, but these techniques are largely syntactic. One practical approach is to build a domain-specific metashell that contains a common knowledge base of the domain and excellent knowledge acquisition facilities for expansion and customization by a wide range of practitioners.
Third, present KBESs are static in two senses. First, a KBES reasons on the basis of the current contents of its knowledge base; a separate component, the knowledge acquisition facility, is used to add to or modify that knowledge base. Second, at the end of a consultation session with a KBES, the context is cleared, so there is no provision for retaining a "memory" of the session. Research on machine learning is maturing to the point where knowledge-based systems will be able to learn by analyzing their own failed or successful performance, an approach called explanation-based learning.
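A toy gesture at retaining session "memory": a backward chainer that caches each goal it proves, so a later session can reuse the derivation instead of repeating it. This is only loosely in the spirit of explanation-based learning (which generalizes explanations rather than merely caching them); all names and rules here are invented:

```python
# Hypothetical rules: (premise-set, conclusion).
ANALYSIS_RULES = [
    ({"nonlinear"}, "newton_iteration"),
    ({"newton_iteration"}, "needs_jacobian"),
]

def prove(goal, facts, rules, memo):
    """Backward chaining that returns the leaf facts supporting `goal`
    (or None on failure), caching each proved goal in `memo` so a later
    session skips the derivation."""
    if goal in memo:
        return memo[goal]          # reuse the retained explanation
    if goal in facts:
        return {goal}
    for premises, conclusion in rules:
        if conclusion == goal:
            support, ok = set(), True
            for p in premises:
                leaves = prove(p, facts, rules, memo)
                if leaves is None:
                    ok = False
                    break
                support |= leaves
            if ok:
                memo[goal] = support   # the "memory" kept between sessions
                return support
    return None
```

Persisting `memo` across consultation sessions is what distinguishes this from an ordinary chainer whose context is cleared at the end of each run.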
One task of great practical payoff is the development of a knowledge base of modeling assumptions that contains what is believed to be the shared knowledge of analysts. Such a core knowledge base will be beneficial in two important ways. First, it could be used as a starting point to build a variety of related expert systems, hence making the development cycle shorter. Second, such a knowledge base could become the "corporate memory" of the discipline and, hence, could give insights into the nature of the various aspects of modeling knowledge.
One starting point for building such a knowledge base is to "reverse engineer" existing models to recognize and extract their assumptions. Two useful precedents from other domains offer guidance. Cyc is a large-scale knowledge base intended to encode knowledge spanning human consensus reality down to some reasonable level of depth: the knowledge that is assumed to be shared between people communicating in everyday situations. Cyc is a long-term effort that started in 1984. Progress to date indicates that the already very large knowledge base (millions of assertions) is not diverging in its semantics and can already operate in some common situations.
Knowledge-Based Emacs (KBEmacs) is a programmer's apprentice: it extends the well-known text editor Emacs with facilities to support programming activities interactively.
The knowledge base of KBEmacs consists of a number of abstract programs (clichés) ranging from very simple abstract data types, such as lists, to abstract notions such as synchronization and complex subsystems such as peripheral device drivers. The fundamental idea is that the knowledge base of clichés encodes the knowledge that is believed to be shared by programmers.
KBEmacs has been used successfully to build medium-sized programs in Lisp and Ada. The objective of this appendix was to present some of the concepts and methodologies of AI and examine some of their potential applications in various aspects of computational mechanics. The methodologies sketched herein are maturing rapidly, and many new applications in computational mechanics are likely to be found. Undoubtedly, AI methodologies will eventually become a natural and integral component of the set of computer-based engineering tools to the same extent as present-day "traditional" algorithmic tools.
These tools will then significantly elevate the role of computers in engineering from the present-day emphasis on calculation to the much broader area of reasoning. Computational mechanics is a scientific discipline that marries physics, computers, and mathematics to emulate natural physical phenomena. It is a technology that allows scientists to study and predict the performance of various products, which is important for research and development in the industrialized world.
This book describes current trends and future research directions in computational mechanics in areas where gaps exist in current knowledge and where major advances are crucial to continued technological developments in the United States.