Neural Mechanisms for Information Compression -- Selected Sample Essay
2016-01-21 Source: 51Due tutor group Category: More sample essays
Over the past few years, a conceptual framework known as information compression by multiple alignment, unification and search (ICMAUS) has been under development, with the aim of integrating concepts from computing, artificial intelligence and cognitive science. At its most abstract level, the framework is intended to model any kind of system for processing information, natural or artificial. The following sample paper elaborates on these ideas.
Abstract
This article describes how an abstract framework for perception and cognition may be realised in terms of neural mechanisms and neural processing. This framework—called information compression by multiple alignment, unification and search (ICMAUS)—has been developed in previous research as a generalized model of any system for processing information, either natural or artificial. Applications of the framework include the representation and integration of diverse kinds of knowledge, analysis and production of natural language, unsupervised inductive learning, fuzzy pattern recognition, recognition through multiple levels of abstraction, probabilistic ‘deduction’ and abduction, chains of reasoning, nonmonotonic reasoning, ‘explaining away’, solving geometric analogy problems, and others.
A key idea in the ICMAUS framework is that information compression is important both as a means of economising on the transmission or storage of information and also as the basis of probabilistic reasoning. The proposals in this article may be seen as an extension and development of Hebb’s [1949] concept of a ‘cell assembly’. The article describes how the concept of ‘pattern’ in the ICMAUS framework may be mapped onto a version of the cell assembly concept and the way in which neural mechanisms may achieve the effect of ‘multiple alignment’ in the ICMAUS framework.
Introduction
In the last few years, I have been developing a conceptual framework—information compression by multiple alignment, unification and search (ICMAUS)—that aims to integrate concepts in computing, AI and cognitive science. At its most abstract level, the framework is intended to model any kind of system for processing information, either natural or artificial. However, much of the inspiration for this work came from established ideas about perception and cognition in humans and other animals. And the framework may be viewed as a model for natural kinds of information processing such as the analysis and production of natural language, recognition of patterns and objects despite distortion or omission of information, probabilistic kinds of reasoning, unsupervised learning, and others.
A key idea in the ICMAUS framework is that information compression is important both as a means of reducing the volume of information (leading to economies in the transmission or storage of information) and also as the basis of probabilistic reasoning. With respect to volume reductions, this idea contrasts with Barlow’s recent views [2001a, 2001b], discussed briefly in Section 8.4 towards the end of the article. In work to date, the ICMAUS framework has been developed in purely abstract terms without reference to the anatomy or physiology of neural tissue. The main purpose of this article is to consider, as far as current knowledge allows, possible ways in which the abstract concepts that have been developed within the ICMAUS framework may be mapped on to structures and mechanisms in the brain. It will be convenient to refer to these proposals as ‘ICMAUS-neural’ or ‘ICMAUS-N’. The main focus of this article is on the storage of knowledge and the way in which sensory data may connect with previously-learned knowledge in perception and learning.
These are areas where there is a relative paucity of neurophysiological evidence compared with what is now known about the processing of sensory data in sense organs, thalamus and cortex. The ICMAUS framework, together with existing knowledge about the way in which neural tissue normally works, is a source of hypotheses about the organisation and workings of the brain. To anticipate a little, it is proposed that ‘patterns’ in the ICMAUS framework are realised with structures resembling Hebb’s [1949] concept of a ‘cell assembly’. By contrast with that concept, it is proposed here that any one neuron can belong in one assembly and only one assembly. However, any assembly may contain neurons that serve as ‘references’, ‘codes’ or ‘identifiers’ for one or more other assemblies. This mechanism allows information to be stored in a compressed form, it provides a robust mechanism by which assemblies may be connected to form hierarchies and other kinds of structure, it means that assemblies can express abstract concepts, and it provides solutions to some of the other problems associated with cell assemblies.
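To make the proposal a little more concrete, the following Python sketch (my own illustration, not part of the ICMAUS-N proposals themselves) shows one way a store of assemblies might be represented: each element of an assembly is either a content symbol or the identifier of another assembly, so abstract structures are stored once and referenced wherever they are needed. The Assembly class, the expand function and the example identifiers are all hypothetical.

```python
class Assembly:
    """One assembly per pattern; elements are content symbols or other assemblies' identifiers."""
    def __init__(self, ident, elements):
        self.ident = ident          # short 'code' by which other assemblies refer to this one
        self.elements = elements    # mix of raw symbols and identifiers of other assemblies

def expand(ident, store):
    """Recursively replace identifiers with the content of the assemblies they name."""
    result = []
    for element in store[ident].elements:
        if element in store:        # the element is a reference to another assembly
            result.extend(expand(element, store))
        else:                       # the element is a content symbol
            result.append(element)
    return result

# Example: an abstract assembly built purely from references to lower-level assemblies.
store = {
    "D1": Assembly("D1", ["t", "h", "e"]),
    "N1": Assembly("N1", ["c", "a", "t"]),
    "NP": Assembly("NP", ["D1", "N1"]),   # stored once, referenced wherever needed
}
print(expand("NP", store))  # ['t', 'h', 'e', 'c', 'a', 't']
```

Because a referencing element is much shorter than the assembly it names, this kind of structure stores shared sub-patterns in a compressed form while still allowing them to be recovered in full.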
The ICMAUS framework
The outline of the ICMAUS framework presented here is intended to provide sufficient detail for the purpose of discussing neural mechanisms without swamping the reader. A more comprehensive overview may be found in Wolff [to appear] and even more detail in articles discussing how the framework relates to mathematics and logic [Wolff, 2002a], analysis and production of language [Wolff, 2000], the nature of ‘computing’ [Wolff, 1999a], probabilistic kinds of reasoning [Wolff, 1999b, 2001a], and unsupervised learning [Wolff, 2002b].
Information compression
Given that IC is a central plank of the ICMAUS framework, a few words may be helpful about the nature of IC. The essence of IC is the extraction of redundancy from information. In terms of communication, ‘redundant’ information is ‘surplus to requirements’ because it repeats what has already been expressed.
However, redundant information can be useful for other reasons (see Section 3.3, below). There may, on occasion, be a case for discarding nonredundant information as well as redundant information—leading to ‘lossy’ IC. However, the main focus in this article will be IC by the extraction of redundancy. Mathematical treatments of IC sometimes have the effect of obscuring a very simple idea that can be seen most clearly in the simpler ‘standard’ methods for IC [see, for example, Storer, 1988] and, arguably, lies at the heart of all methods for IC: If a pattern repeats two or more times in a body of information then the information may be compressed by merging or ‘unifying’ the repeated instances of the pattern to make one. A repeated pattern is an expression of redundancy in information and unification is the means by which that redundancy is extracted. Clearly, large patterns yield more compression than small ones and frequent patterns yield more compression than rarer ones. For any given size of pattern, there is a minimum frequency for achieving compression.
Large patterns can yield compression when frequencies are low (even as small as 2) but with smaller patterns, the minimum frequency is larger. Since unification destroys information about the positions of all but one of the patterns that are unified, these positions are normally marked by some kind of relatively short ‘reference’, ‘identifier’ or ‘code’. As a general rule, patterns that occur frequently should be given shorter codes than patterns that are rare. This is the central idea in Huffman coding and related techniques [see Cover and Thomas, 1991].
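As a rough illustration of the unification idea (a minimal sketch under my own assumptions, not code from the ICMAUS programme), the following Python fragment merges repeated instances of a pattern into one stored copy and marks each former position with a short code; the unify function and the choice of code symbol are invented for the example.

```python
def unify(text, pattern, code):
    """Keep one copy of the repeated pattern and mark its former positions with a short code."""
    dictionary = {code: pattern}
    compressed = text.replace(pattern, code)
    return compressed, dictionary

text = "the cat sat on the mat near the dog"
compressed, dictionary = unify(text, "the ", "%")
print(compressed)   # %cat sat on %mat near %dog
print(dictionary)   # {'%': 'the '}

# Rough saving: each of the 3 occurrences shrinks from 4 symbols to 1,
# at the cost of storing one copy of the pattern plus its code in the dictionary.
```

The example also shows why pattern size and frequency matter: the saving per occurrence is roughly the length of the pattern minus the length of its code, so small or rare patterns may not repay the cost of storing the dictionary entry.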
Information compression, frequency and counting
As we have seen, frequency is important for IC, both in terms of the number of patterns that are unified and also because frequent patterns can be given shorter codes than rare ones. But there is another connection between IC and frequency that is less widely recognised: frequency implies counting, counting implies recognition, and recognition implies IC by the unification of matching patterns. In case the last two points seem obscure, consider what is entailed in counting the number of apples in a bowl of fruit. It is not possible to do this without recognising that one apple is the ‘same’ (at some level of abstraction) as the next one or that they are all the ‘same’ as some abstract representation of the concept ‘apple’. In itself, the process of counting implies the merging or unification of several instances into a single concept. Thus counting, in itself, implies IC.
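A toy example may make the last point clearer. In the sketch below (my own illustration, with a deliberately crude notion of ‘recognition’), each item that is counted must first be unified with a single stored concept, so the count falls out of repeated recognition.

```python
def recognise(item, concept):
    """Crude 'recognition': an item matches the concept if its kind label does."""
    return item["kind"] == concept

bowl = [
    {"kind": "apple", "colour": "red"},
    {"kind": "apple", "colour": "green"},
    {"kind": "pear",  "colour": "green"},
]

# Each successful match unifies a distinct instance with the single stored
# concept 'apple'; the count is a by-product of that repeated unification.
apple_count = sum(1 for item in bowl if recognise(item, "apple"))
print(apple_count)  # 2
```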

