Noise and Complexity -- Selected Sample Essay
2016-02-04 Source: 51due tutor group Category: Sample essay
In such a case we are effectively treating noise as what is merely unexplained at present owing to its complexity, since many chaotic processes can produce data indistinguishable from random data. Alternatively, we can decide to look for less specific models. The sample essay below elaborates.
Abstract
The above account illustrates the importance of judging what in the data may be considered noise. If we take noise as what is effectively unpredictable in a pattern, then different approaches to, and conceptions of, noise naturally emerge from different responses to excess error. Imagine we are faced with an unacceptable level of error in the predictions of our best current model. What could be done? Firstly, we could search for more accurate models by widening the search to other equally precise models. It is sensible to try the easier models first, but if we exhaust all the models at our current level of complexity we will be forced to try more complex ones. In this case we are effectively discounting the possibility that the unexplained elements of the data are unpredictable, and treating noise as what is merely currently unexplained due to its complexity.
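This first response, widening the search in order of increasing complexity, can be sketched with a toy example (the data and model class are hypothetical, using polynomial degree as a crude proxy for complexity):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.size)  # signal plus noise

# Search models in order of increasing complexity (polynomial degree):
# what a simple model leaves unexplained may be unmodelled structure,
# not genuine noise.
errors = []
for degree in (1, 3, 5, 9):
    coeffs = np.polyfit(x, y, degree)
    rmse = float(np.sqrt(np.mean((y - np.polyval(coeffs, x)) ** 2)))
    errors.append(rmse)
    print(f"degree {degree}: training RMSE {rmse:.3f}")
```

Training error can only fall as complexity grows, which is exactly why excess error alone cannot tell us whether the residual is genuine noise or merely unexplained structure.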
This view is taken in the light of the many chaotic processes that can produce data indistinguishable from purely random data. Secondly, we could decide to look for models that are less specific. This might allow us to find a model that is not significantly more complex but has a lower level of predictive error. Here we are essentially filtering out some of the data, attributing it to an irrelevant source. This corresponds to a situation where you know that an essentially random source of noise has been imposed upon the data; it is the traditional approach, used in fields ranging from electronics to economics. Thirdly, and most radically, we could seek to change our language of modelling to one we feel is more appropriate to the data. Here noise is the literally indescribable. For example, a neural network (NN) is sometimes set up so that extreme fluctuations in the data cannot be exactly captured by the range of functions the NN can output. In this way the NN is forced to approximate the training data, and overfitting is avoided. Thus randomness may be a sufficient characterisation of noise, but it is not a necessary one.
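The second response, filtering out a component attributed to an irrelevant random source, can be illustrated with a minimal sketch (the moving-average filter and the data are my assumptions, not the essay's):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 200)
signal = np.sin(t)
noisy = signal + rng.normal(0.0, 0.3, t.size)  # imposed random noise

# A less specific model: commit only to the smooth trend and attribute
# the high-frequency residual to an irrelevant (random) source.
window = 11
smoothed = np.convolve(noisy, np.ones(window) / window, mode="same")

rmse_raw = float(np.sqrt(np.mean((noisy - signal) ** 2)))
rmse_filtered = float(np.sqrt(np.mean((smoothed - signal) ** 2)))
print(f"RMSE before filtering: {rmse_raw:.3f}, after: {rmse_filtered:.3f}")
```

The filtered estimate is less specific (it cannot reproduce the fine fluctuations at all) yet closer to the underlying signal, which is the point of this response to excess error.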
Complexity vs. Information
The above framework distinguishes between the complexity of the model form and its specificity. The specificity of a model has been characterised in many ways, including: the information a model provides, the system's entropy, and the model's refutability. Such measures of specificity have often been linked to a model's simplicity, where by simplicity we mean that property of a model which makes it more likely to be true than another, given that they have equal evidential support. This property is called "simplicity" because it is traced back to the principle of parsimony attributed to William of Occam.
Thus Popper characterises simplicity as a model's refutability [11], while Sober has associated it with the minimum extra information needed to answer a given question [14]. This tradition has been continued by several authors who have used various measures of information to capture it, including Shannon information and algorithmic information. It is clear that such simplicity is not necessarily the opposite of complexity as described above (see section 8). That complexity is not rigidly linked to the specificity of a model can be shown by considering any modelling language with terms explicitly denoting nonspecificity (frequently called "error terms"). Clearly, the introduction of such terms can make an expression simultaneously more complex and less specific. This is not to say that there may not be good reasons to prefer a more specific model, just that specificity is not directly linked to either a model's complexity or its error rate. Rissanen makes a case for a particular trade-off between the specificity of a model and its complexity: one should seek the minimal description which includes both the model and the deviations from the model.
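Rissanen's trade-off can be made concrete with a toy two-part code (a rough sketch using a BIC-style Gaussian approximation, not Rissanen's exact coding scheme; the data and model class are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.1, n)  # quadratic truth

def description_length(degree):
    """Bits for the model's parameters plus bits for the deviations
    of the data from the model (crude two-part code)."""
    k = degree + 1
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    return 0.5 * k * np.log2(n) + 0.5 * n * np.log2(rss / n)

lengths = {d: description_length(d) for d in range(6)}
best = min(lengths, key=lengths.get)
print(f"shortest total description at degree {best}")
```

A more complex model shortens the second part (smaller deviations) but lengthens the first; the minimum of the sum is the trade-off Rissanen advocates.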
Complexity and Induction
From this framework it is clear that a lack of complexity is, in general, not a reliable guide to a model's error rate, unless the problem and modelling language happen to be constructed that way. On the other hand, experience has frequently shown that the less complex theory often turns out to be more useful. The answer to this riddle becomes clear when one takes into account the process by which models are typically developed. An ideal modeller without significant resource restrictions might well be able to attempt a fairly global search through possible model forms.
Some automatic machine-based systems approximate this situation, and there it does indeed seem to be the case that a lack of complexity is no guide to truth (e.g. [13]); if anything the reverse holds, since there are typically many more complex expressions than simple ones. Usually, however, and certainly in the case of human modellers, we do not have this luxury: only a very limited number of the possible model forms can be checked. Fortunately, the meaning of the models frequently allows us to intelligently develop and combine the models we have, producing new models much more likely to yield something useful than a typical automatic procedure. Thus it is frequently sensible to try elaborations of known models before launching into unknown territory, where success is at best extremely uncertain. On its own, elaboration is of course an inadequate strategy: one can reach a position of diminishing returns where each elaboration brings a smaller improvement in the error rate at an increasing cost.
At some stage, preferring simpler and more radically different models will be more effective. Thus choosing the simpler model, even if it is less precise and accurate, is sometimes a sensible heuristic, but only given our knowledge of the process of theory elaboration that frequently occurs.
Conclusion
Complexity is usefully distinguished from both the probability of correctness (the error) and the specificity of the model. It is relative both to the type of difficulty one is concerned with and to the language of modelling. Complexity does not necessarily correspond to a lack of "simplicity", nor does it lie between order and disorder. When modelling is done by agents with severe resource limitations, the acceptable trade-offs between complexity, error and specificity can determine the effective relations between them; the characterisation of noise emerges from this. Simpler theories are not a priori more likely to be correct, but if one knows that the theories are made by an agent for whom it is easier to elaborate than to engage in a wider search, preferring the simpler theory at the expense of accuracy can be a useful heuristic.
