Similar Articles
19 similar articles found
1.
The fuzzy c-means (FCM) clustering algorithm is sensitive to noise points and outliers. The possibilistic fuzzy c-means (PFCM) algorithm largely overcomes this problem, but it remains sensitive to the initial clustering centers and performs poorly when noisy data sets are highly imbalanced. This paper proposes an improved kernel possibilistic fuzzy c-means algorithm based on invasive weed optimization (IWO-KPFCM). The algorithm first uses invasive weed optimization (IWO) to search for optimal initial clustering centers, and applies a kernel method to map the input data from the sample space into a high-dimensional feature space. The sample variance is then introduced into the objective function to measure the compactness of the data, and the improved algorithm is used to cluster the data. Simulation results on University of California, Irvine (UCI) data sets and artificial data sets show that the proposed algorithm is more robust to noise, achieves higher clustering accuracy, and converges faster than the PFCM algorithm.
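As a point of reference for entry 1, the sketch below implements only the classic FCM updates that PFCM and IWO-KPFCM extend; the kernel mapping, IWO-based center initialization, and sample-variance term are not reproduced here, and the function name and data handling are our own assumptions.

```python
# A minimal sketch of the standard fuzzy c-means (FCM) baseline that entry 1 builds on.
import numpy as np

def fcm(X, c, m=2.0, max_iter=100, tol=1e-5, seed=None):
    """X: (n, d) data; c: number of clusters; m: fuzzifier (> 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)                     # random fuzzy memberships
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]    # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)         # standard FCM membership update
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U
```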

2.
Ontology is the conceptual backbone that gives meaning to data on the semantic web. However, an ontology is not a static resource and may evolve over time, which often leaves the meaning of data in an undefined or inconsistent state. It is therefore important to have a method that preserves the data and its meaning when the ontology changes. This paper proposes a general method that solves the problem through data migration, and analyzes several issues within the method, including the separation of ontology and data, the migration specification, the migration result, and the migration algorithm. The paper also instantiates the general method in RDF(S) as an example; the RDF(S) instantiation is itself a simple but complete method for migrating RDF data when the RDFS ontology changes.
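A hedged sketch of the RDF(S) migration idea in entry 2, under our own simplifying assumption: the ontology change is reduced to a single old-class to new-class renaming, and the instance data is re-typed accordingly. The namespace, resource names, and mapping are placeholders, not the paper's migration specification.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/")          # placeholder namespace
data = Graph()
data.add((EX.alice, RDF.type, EX.Person))      # instance data typed against the old ontology
data.add((EX.bob, RDF.type, EX.Person))

# migration specification (assumed): classes renamed by the ontology change
renamed = {EX.Person: EX.Agent}

for old_cls, new_cls in renamed.items():
    for subj in list(data.subjects(RDF.type, old_cls)):
        data.remove((subj, RDF.type, old_cls))
        data.add((subj, RDF.type, new_cls))    # re-type the instance under the new class

print(data.serialize(format="turtle"))
```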

3.
Objective: To investigate the information contained in the condition parameters of stator bars when only a few samples are available, especially the correlation between the nondestructive parameters and the residual breakdown voltage of the stator bars. Methods: Artificial stator bars were designed to simulate generator bars. Partial discharge (PD) and dielectric loss experiments were performed to obtain the nondestructive parameters, and the residual breakdown voltage was acquired by an AC damage experiment. To eliminate the effect of dimension on the measurement data, the raw data were preprocessed by centering and compressing. Based on the idea of extracting principal components, a partial least squares (PLS) method was applied to screen and synthesize the correlation between the nondestructive parameters and the residual breakdown voltage, and various aspects of the condition-parameter data are also discussed. Results: The graphical analysis functions of PLS make the information in the stator-bar condition parameters easy to interpret, and the analysis results are consistent with those of the aging tests. Conclusion: The method can select and extract PLS components of the condition parameters from the sample data, effectively solving the problems of small sample size and multicollinearity in regression analysis.
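A hedged sketch of the PLS regression step described in entry 3: relating nondestructive parameters to residual breakdown voltage on a small sample. The feature set, sample size, and values below are synthetic placeholders, not the paper's measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

# X: nondestructive parameters (e.g. PD magnitude, dielectric loss); y: residual breakdown voltage
rng = np.random.default_rng(0)
X = rng.random((12, 4))                        # 12 bars, 4 correlated condition parameters (synthetic)
y = 30 + 5 * X[:, 0] - 3 * X[:, 1] + 0.1 * rng.normal(size=12)

X_s = StandardScaler().fit_transform(X)        # "centered and compressed" preprocessing
pls = PLSRegression(n_components=2)            # extract a few PLS components
pls.fit(X_s, y)

print("R^2 of the PLS model:", pls.score(X_s, y))
print("x loadings (which parameters drive each component):\n", pls.x_loadings_)
```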

4.
This paper investigates Web data aggregation issues in multidimensional on-line analytical processing (MOLAP) and presents a rule-driven aggregation approach. The core of the approach is the definition of aggregation rules. To define the rules for reading warehouse data and computing aggregates, a rule definition language, the array aggregation language (AAL), is developed. This language treats an array as a function from indexes to values and provides syntax and semantics based on monads. External functions can be called in aggregation rules to specify array reading, writing, and aggregation. Because of these features of AAL, array operations are unified as function operations, which can be expressed easily and evaluated automatically. To implement the aggregation approach, a processor for computing aggregates over the base cube and materializing them in the data warehouse is built, and the component structure and working principle of the aggregation processor are introduced.
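Illustrative only: entry 4's AAL is a rule language with monadic semantics, which is not reproduced here. The sketch below shows just the underlying idea of aggregating a base cube along dimensions, treating the array as a mapping from index tuples to values; the dimensions and numbers are assumptions.

```python
import numpy as np

# base cube: (product, region, month) -> sales, synthetic data
rng = np.random.default_rng(0)
base_cube = rng.integers(0, 100, size=(3, 4, 12))

# materialize some aggregates of the base cube
sales_by_product_region = base_cube.sum(axis=2)    # roll up over month
sales_by_region = base_cube.sum(axis=(0, 2))       # roll up over product and month
grand_total = base_cube.sum()

print(sales_by_product_region.shape, sales_by_region.shape, grand_total)
```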

5.
Competitive intelligence (CI) is a key factor in helping business leaders gain and maintain competitive advantages. The emergence of big data and Web 2.0 has created new opportunities, and also new challenges, for enterprises seeking to obtain CI effectively. This paper explores a CI identification method based on strategic factors (SF). By applying a filtering process before CI collection, the core CI, closely related to critical success factors and crisis-inducing factors, is identified reliably and efficiently. Based on a knowledge element model and a multi-attribute fusion method, emphasis is placed on constructing a criterion function with which an SF thesaurus for achieving CI objectives is established. The advantages of this method lie not only in its ability to mine the core CI from massive data, but also in providing a foundation for efficient CI storage and analysis. The work contributes to a thorough inquiry into CI acquisition and fusion methods for CI systems in the era of big data, and the experimental results verify the feasibility and validity of the approach.

6.
7.
Back analysis of the initial stress field is usually based on measured stress values, but measuring initial stress requires substantial investment, so many underground engineering projects, such as tunnels, have no measured initial stress data. To address this problem, a new back analysis method that does not require measured initial stress data is developed. Faulting is assumed to be caused by the initial load, the displacement discontinuity method (DDM) with non-linear fault behavior is adopted to establish a numerical model of the engineering site, and a multivariable regression analysis of the initial stress field around the faults is carried out based on the fault throw. The results show that the initial stress field around the faults is significantly disturbed, stress concentration appears in the fault tip zones, the regressed fault throw matches the measured values well, and the regressed initial stress field is reliable.
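A minimal sketch, under our own assumptions, of the regression step in entry 7: the initial stress field is expressed as a linear combination of basis load cases, and the coefficients are fitted so that the fault throw computed by the numerical model matches the observed throw. The DDM solver itself is not shown; throw_per_case stands in for its output, and all numbers are placeholders.

```python
import numpy as np

# observed fault throw at k points along the faults (placeholder values)
throw_obs = np.array([0.12, 0.35, 0.41, 0.28, 0.09])

# fault throw computed by the numerical model for each unit basis load case (k x m)
throw_per_case = np.array([
    [0.05, 0.02, 0.01],
    [0.15, 0.06, 0.02],
    [0.17, 0.08, 0.02],
    [0.11, 0.06, 0.01],
    [0.04, 0.01, 0.00],
])

# least-squares regression coefficients for the basis load cases
coef, *_ = np.linalg.lstsq(throw_per_case, throw_obs, rcond=None)
throw_fit = throw_per_case @ coef
print("regression coefficients:", coef)
print("residual norm:", np.linalg.norm(throw_obs - throw_fit))
```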

8.
A new multi-sensor data fusion algorithm based on EMD-MMSE is proposed. Empirical mode decomposition (EMD) is used to extract the noise of each time series and estimate its variance, and a minimum mean square error (MMSE) estimator is then used to calculate the weight of the corresponding series. The fused signal is the weighted sum of all the series. Laboratory experiments confirm the efficiency of the method, and comparisons of fusion time and fusion results with existing fusion methods based on wavelets and simple averaging show its clear advantage.
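A hedged sketch of the fusion step in entry 8. The EMD-based noise extraction is not reproduced; the noise variance of each sensor series is assumed to be already estimated. For independent sensors observing the same quantity, the MMSE weights are inversely proportional to each sensor's noise variance.

```python
import numpy as np

def mmse_fuse(series, noise_vars):
    """series: (k, n) array of k sensor time series; noise_vars: (k,) estimated noise variances."""
    w = 1.0 / np.asarray(noise_vars)
    w = w / w.sum()                       # normalized inverse-variance weights
    return w, w @ series                  # fused signal is the weighted sum

# three noisy observations of the same signal (synthetic example)
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * 3 * t)
sigmas = [0.1, 0.3, 0.5]
obs = np.stack([truth + s * rng.normal(size=t.size) for s in sigmas])

weights, fused = mmse_fuse(obs, [s ** 2 for s in sigmas])
print("weights:", weights)                # the least noisy sensor gets the largest weight
```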

9.
Classifying intrusion attacks versus normal network flows is a critical and challenging issue in network security. Many intelligent intrusion detection models have been proposed, but their performance and efficiency are not satisfactory for real computer networks. This paper presents a novel and effective intrusion detection system based on a statistical reference model and twin support vector machines (TWSVMs). In addition, a network flow feature selection procedure is studied and implemented with TWSVMs. The performance of the proposed system is evaluated on the KDD'99 data set, collected at MIT Lincoln Laboratory for the Fifth International Conference on Knowledge Discovery and Data Mining (1999), and the results indicate that the proposed system is more efficient and effective than conventional support vector machines (SVMs) and plain TWSVMs.
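Entry 9 compares its TWSVM-based detector against conventional SVMs. The sketch below is only that conventional-SVM baseline on placeholder flow features, not the paper's system, its statistical reference model, or a TWSVM implementation; the scale-then-classify pipeline is the usual treatment one would also apply to KDD'99-style records.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic "flow features" and labels (0 = normal, 1 = attack), stand-ins for KDD'99 fields
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("baseline SVM accuracy:", clf.score(X_te, y_te))
```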

10.
Based on a Wiener process model, a new approach for reliability evaluation of cross-linked polyethylene (XLPE) is proposed to improve the reliability of XLPE lifetime evaluation under multi-stress conditions and to study the failure probability distribution. Two accelerated aging tests are carried out under combined thermal and vibration conditions, and the volume resistance degradation of XLPE samples is measured at 24 h intervals under accelerated stress conditions of (130 °C, 12 m/s²) and (150 °C, 8.5 m/s²), respectively. The nonlinear degradation data obtained from the experiments are transformed into linear intermediate values using a time-scale function, and the linearized degradation data are then evaluated with a linear Wiener process model. Combining the traditional Arrhenius equation and the inverse power law, the parameters of the linear Wiener model are estimated by maximum likelihood. Curves of the probability density and reliability are given, and the lifetime distribution of XLPE under different stress conditions is obtained for evaluating the reliability of XLPE insulation. The life expectancy of XLPE is found to be 17.9 years at an allowable temperature of 90 °C and an actual vibration acceleration of 0.5 m/s². The approach and results may be used for reliability assessment of multiple high-voltage samples or apparatus.
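A minimal sketch, with synthetic numbers, of the linear Wiener degradation model used in entry 10: maximum-likelihood estimation of drift and diffusion from degradation increments, and the resulting reliability (first-passage) function. The time-scale transformation and the Arrhenius / inverse power acceleration models of the paper are not reproduced here, and the threshold and data are placeholders.

```python
import numpy as np
from scipy.stats import norm

# degradation increments dX over equal inspection intervals dt (placeholder data)
dt = 24.0                                   # hours between inspections
dX = np.array([0.9, 1.6, 0.5, 1.8, 0.7, 1.4])

# maximum-likelihood estimates for X(t) = mu*t + sigma*B(t)
mu = dX.sum() / (dt * dX.size)
sigma = np.sqrt(np.mean((dX - mu * dt) ** 2 / dt))

def reliability(t, w, mu, sigma):
    """P(first passage of threshold w occurs after time t) for a linear Wiener process."""
    a = (w - mu * t) / (sigma * np.sqrt(t))
    b = -(w + mu * t) / (sigma * np.sqrt(t))
    return norm.cdf(a) - np.exp(2 * mu * w / sigma ** 2) * norm.cdf(b)

w = 50.0                                    # failure threshold on the degradation scale
t = np.array([200.0, 500.0, 1000.0, 1500.0])
print(reliability(t, w, mu, sigma))
```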

11.
Introduction: Using data warehousing and on-line analytical processing (OLAP) technology, leaders of business organizations can gain insight into the global market and make sound business decisions quickly to ensure that the organization will excel in the ever…

12.
The Unified Modeling Language (UML) is a language for visually modeling software-intensive systems, but UML is not a formal modeling language and lacks precise semantic descriptions, which can cause problems. The B method is a relatively mature formal software method characterized by precision and freedom from ambiguity. This paper uses B notation to express the semantics of the elements of UML class diagrams and their mapping relationships, and presents a method for describing UML class diagrams with the B method.

13.
Applied research on UML-based workflow models
This paper analyzes the composition of workflow models and, based on the characteristics of the Unified Modeling Language, formally defines the structure of UML activity diagrams for modeling workflow processes and the corresponding modeling rules. An application example is given to describe the modeling process, and the resulting model is analyzed in detail.

14.
This work studies the agent model in open multi-agent system (MAS) environments with constraints such as time and resources, and proposes a general layered framework for hybrid agents. The framework consists of a property specification layer, a mental state layer, a computation layer, and an information interaction layer. The agent's architecture, the roles it plays in an organization, and its component composition are analyzed. UML techniques are used to design the agent's conceptual class diagram, a formal representation of the agent and a design method for its general-purpose program are given, and the implementation mechanisms of the integrated scheduling engine and of concurrent message processing are described. The framework is applied to the study of wide-area line agent protection for the through power line of the Qinghai-Tibet Railway: multiple agents are designed to realize cooperative protection and to solve the maloperation problem of traditional protection, showing that a design based on this agent framework is effective for this complex problem.

15.
A formal method for behavior verification and data validation of station interlocking systems
A station interlocking system is a typical data-driven safety-critical system, and during development its behavior must be verified and the correctness of its data confirmed. To this end, the design specification of the interlocking system is analyzed, and an initial model of the system is quickly built on the RODIN platform with the Event-B language, aided by UML (unified modeling language) diagram tools, so that model files are generated automatically and the system properties and event flows are described. Following a refinement strategy, the model is built in layers, the proof obligations of each layer are discharged by theorem proving, the system properties are verified, and a reliable generic functional model is obtained. Using an example station, the axioms of the model are checked and the interlocking data are thereby validated. Through this formal verification process, combined with validity checking of the interlocking data for given scenarios, potential behavioral defects introduced during requirements capture and analysis are found and corrected, and functional simulation and acceptance testing further confirm the correctness of the generic model and the interlocking data. The results show that the method improves the accuracy and layered structure of the model-based development process, verifies the generic behavioral states of the system, achieves validation of the interlocking data through axiom checking, and supports model-based functional scenario simulation and testing, thereby further improving the reliability of the system's generic functional prototype.

16.
Architecture of an expressway maintenance management system based on a synthetic decision support system (SDSS)
Existing expressway maintenance management systems are only simple decision support systems in which decisions are driven by models alone, so their level of decision intelligence is low. To improve the capability and intelligence of expressway maintenance decision making, an expressway maintenance management system based on an SDSS is proposed, combining intelligent decision support and business intelligence techniques so that decisions are driven jointly by models, knowledge, and data. The SDSS consists of a data warehouse, on-line analytical processing (OLAP), data mining, a model base, a knowledge base, and a database: the data warehouse stores and consolidates the data of decision subjects; OLAP provides multidimensional data analysis; data mining extracts knowledge from the database and the data warehouse; the model base combines multiple generalized models to support decisions; the database supplies data for decision support; and the expert system performs qualitative analysis through knowledge-based reasoning. Organically integrated, these components complement and depend on one another, each contributing its own decision-support strengths and together providing more effective decision support.

17.
A civil aviation disaster early-warning decision support system based on a data warehouse and OLAP
To solve the problems of multiple data sources and multi-objective decision making in civil aviation disaster early-warning management, a civil aviation disaster early-warning decision support system based on data warehouse and on-line analytical processing (OLAP) technology is constructed. The system comprises an application interface layer, a decision-support function layer, a repository management and interface layer, an information repository layer, and a data layer. The subjects of the data warehouse are the early-warning indicators of airlines, air traffic control centers, and airports. Four dimensions, region, organization, indicator name, and time, are designed, together with a snowflake schema for the data warehouse and three granularities: year, quarter, and month. The data warehouse is partitioned first by the region dimension and then by the organization dimension. The multidimensional decision analysis process, including slice, dice, and rotate (pivot) operations, is also described.
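Illustrative only: a tiny pandas sketch of the slice, dice, and rotate (pivot) operations that entry 17 describes over its region / organization / indicator / time dimensions. The column names and values are placeholders, not the paper's warehouse schema.

```python
import pandas as pd

facts = pd.DataFrame({
    "region":    ["North", "North", "South", "South"],
    "org":       ["airline", "airport", "airline", "ATC"],
    "indicator": ["delay_rate", "delay_rate", "incident_rate", "incident_rate"],
    "quarter":   ["2023Q1", "2023Q2", "2023Q1", "2023Q2"],
    "value":     [0.12, 0.08, 0.03, 0.02],
})

# slice: fix one dimension member
north = facts[facts["region"] == "North"]

# dice: restrict several dimensions at once
sub = facts[(facts["region"] == "South") & (facts["quarter"] == "2023Q1")]

# rotate (pivot): swap which dimensions appear on rows and columns
cube_view = facts.pivot_table(index="region", columns="quarter", values="value", aggfunc="mean")
print(cube_view)
```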

18.
To achieve networked management of process information and integrated, collaborative shop-floor production management in automobile manufacturing enterprises, the Unified Modeling Language (UML) is used to analyze and model an automobile manufacturing process information system. Based on the actual process workflow of a manufacturing enterprise, the main use cases are identified, the system boundary is determined, and the activity diagrams, analysis classes, and sequence diagrams are established; the modeling steps are given, and the approach is summarized and evaluated.

19.
A multidimensional accident cause analysis model for human factors
To improve air traffic safety, business process management is combined with the Human Factors Analysis and Classification System to propose a multidimensional accident cause analysis model. The steps for applying the model are defined, and the model is used for the cause analysis of a real case. The results show that the model guides investigators in using the "Swiss cheese" model to thoroughly uncover the weaknesses in each step of the work process; that the level of detail and completeness of the work process description is proportional to the detail and comprehensiveness of the analysis results; that the deeper the levels of cause classification, the more detailed the analysis; that applying the model in domains with relatively fixed work processes can optimize those processes and improve system reliability; and that storing the work processes and the relationships among causes in a database facilitates event reconstruction and provides useful data for risk assessment and safety early warning.
