51.
This paper shows how to recover train arrival times from the gate times of metro passengers recorded in smart card data. Such a technique is essential when a log (the set of records giving the actual arrival and departure time of each bus or train at each station, and a critical input to reliability analysis of a transportation system) is partially or entirely missing. The procedure reconstructs each train as a sequence of the earliest exit times, called S-epochs, among its alighting passengers at each station. It first constructs a set of passengers, called reference passengers, whose routing choices are easily identifiable. It then computes, from the exit times of the reference passengers, a set of tentative S-epochs using a detection measure whose validity rests on an extreme-value characteristic of the platform-to-gate movement of alighting passengers. Each tentative S-epoch is then either accepted as a true one or rejected, based on its consistency with bounds and/or interpolation from previously established S-epochs of adjacent trains and stations. Tested on 12 daily sets of trains, with varying degrees of missing logs, from three entire metro lines, the method restored the arrival times of 95% of trains within an error of 24 s even when 100% of the logs were missing. The mining procedure can also be applied to trains operating under special strategies such as short-turning and skip-stop. The recovered log appears precise enough for the reliability analysis currently performed by the city of Seoul.
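As a rough illustration of the S-epoch idea only (not the paper's extreme-value detection measure), a minimal sketch: gate-exit times of alighting passengers at one station are clustered by silence gaps, and the earliest exit of each cluster becomes a tentative S-epoch. The function name and the `gap` threshold are illustrative assumptions.

```python
def tentative_s_epochs(exit_times, gap=120):
    """Cluster gate-exit times (seconds) of alighting passengers at one
    station: a silence longer than `gap` is taken to separate the
    passengers of two consecutive trains.  The earliest exit in each
    cluster is a tentative S-epoch."""
    if not exit_times:
        return []
    times = sorted(exit_times)
    epochs = [times[0]]          # earliest exit of the first cluster
    prev = times[0]
    for t in times[1:]:
        if t - prev > gap:       # long silence: next train's passengers
            epochs.append(t)     # earliest exit of the new cluster
        prev = t
    return epochs
```

In the paper, these tentative epochs would then be validated against bounds and interpolation from adjacent trains and stations; the gap heuristic here merely stands in for that step.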
52.
The effectiveness of traditional incident detection is often limited by sparse sensor coverage, and reporting incidents to emergency response systems is labor-intensive. We propose mining tweet texts to extract incident information on both highways and arterials as an efficient and cost-effective complement to existing data sources. This paper presents a methodology to crawl, process, and filter tweets that are freely accessible to the public. Tweets are acquired from Twitter in real time using the REST API. An adaptive data-acquisition process builds a dictionary of important keywords, and combinations of them, that can imply traffic incidents (TI). Each tweet is then mapped to a high-dimensional binary vector in a feature space formed by the dictionary and classified as TI-related or not. All TI tweets are then geocoded to determine their locations and further classified into one of five incident categories. We apply the methodology in two regions, the Pittsburgh and Philadelphia metropolitan areas. Overall, mining tweets holds great potential as an inexpensive complement to existing traffic incident data. A small sample of tweets acquired from the Twitter API covers most of the incidents reported in the existing data set, and additional incidents can be identified by analyzing tweet texts. Twitter also provides ample additional information with reasonable coverage of arterials. Tweets that are TI-related and geocodable account for approximately 5% of all acquired tweets. Of these geocodable TI tweets, 60–70% are posted by influential users (IUs), namely public Twitter accounts mostly owned by public agencies and the media, while the rest are contributed by individual users. Twitter provides more incident information on weekends than on weekdays. Within a day, both individuals and IUs tend to report incidents more frequently during daytime than at night, especially during traffic peak hours. Individual tweets are more likely to report incidents near a city center, and the volume of information decays significantly outward from the center.
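The dictionary-based feature mapping can be sketched as follows; the keyword list and the tokenization are assumptions for illustration, not the paper's actual dictionary.

```python
import re

def to_binary_vector(tweet, dictionary):
    """Map a tweet into a binary vector over a keyword dictionary:
    component i is 1 iff dictionary keyword i occurs in the tweet."""
    tokens = set(re.findall(r"[a-z0-9']+", tweet.lower()))
    return [1 if kw in tokens else 0 for kw in dictionary]

# Hypothetical incident-keyword dictionary for illustration.
dictionary = ["accident", "crash", "blocked", "closed"]
```

A classifier (the paper does not commit to a specific one in this abstract) would then be trained on these binary vectors to separate TI-related tweets from the rest.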
53.
This paper demonstrates the usefulness of integrating virtual 3D models into vehicle localization systems. Vehicle localization algorithms are usually based on multi-sensor data fusion. Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS), provide measurements of geographic location. Nevertheless, GNSS solutions suffer from signal attenuation and masking, multipath phenomena, and lack of satellite visibility, especially in urban areas, which degrades the positioning information or causes its total loss, leading to unsatisfactory performance. Dead-reckoning and inertial sensors are therefore often added to back up GPS when measurements are inaccurate or unavailable, or when high-frequency location estimates are required. However, dead-reckoning localization drifts in the long term due to error accumulation. To back up GPS and compensate for the drift of dead-reckoning localization, two approaches are proposed in which a virtual 3D model is registered with respect to the scene perceived by an on-board sensor. From the matching of the real and virtual scenes, the transformation (rotation and translation) between the real sensor and the virtual sensor, whose position and orientation are known, can be computed. Both approaches thus determine the pose of the real sensor mounted on the vehicle. In the first approach, the perception sensor is a camera; in the second, it is a laser scanner. The first approach matches a virtual image extracted from the 3D city model against the real image acquired by the camera, and has two major parts: (1) detection and matching of feature points in the real and virtual images (three feature detectors are compared: the Harris corner detector, SIFT, and SURF); (2) pose computation using the POSIT algorithm. The second approach is based on an on-board horizontal laser scanner that provides a set of distances between the scanner and the environment. This set of distances is matched with depth information (virtual laser scan data) provided by the virtual 3D city model. The pose estimates produced by the two approaches can be integrated into a data fusion formalism; in this paper, the result of the first approach is integrated into an IMM-UKF data fusion formalism. Experimental results obtained on real data illustrate the feasibility and performance of the proposed approaches.
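The core geometric step, recovering the rotation and translation between the real sensor and the virtual one from matched scene points, can be sketched in 2D as a least-squares rigid alignment of already-matched point pairs. This is a simplified stand-in, not the POSIT algorithm or the paper's scan-matching method.

```python
import math

def rigid_transform_2d(real_pts, virtual_pts):
    """Least-squares rotation angle and translation mapping matched
    2-D points of the real scan onto the virtual (model) scan."""
    n = len(real_pts)
    # Centroids of both point sets.
    rcx = sum(p[0] for p in real_pts) / n
    rcy = sum(p[1] for p in real_pts) / n
    vcx = sum(p[0] for p in virtual_pts) / n
    vcy = sum(p[1] for p in virtual_pts) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (rx, ry), (vx, vy) in zip(real_pts, virtual_pts):
        ax, ay = rx - rcx, ry - rcy
        bx, by = vx - vcx, vy - vcy
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    # Optimal rotation angle, then translation aligning the centroids.
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = vcx - (c * rcx - s * rcy)
    ty = vcy - (s * rcx + c * rcy)
    return theta, (tx, ty)
```

In the paper's setting the correspondences come from image feature matching or from pairing real and virtual laser returns; here they are simply given.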
54.
Risk-based inspection is nowadays the predominant approach to structural integrity assurance in complex engineering systems, such as those designed for and operated in deepwater environments. One of the major tasks in risk-based planning is risk ranking of the components comprising an engineering system. In this paper, a mathematical model based on data envelopment analysis is developed for this purpose, employing two types of weights: subjective judgmental weights, which are provided as input, and objective weights, which constitute the output of the model. The use of the model is illustrated on a real-world system of subsea flexible pipes operating on the South Atlantic shelf off the southeast coast of Brazil.
55.
An Automatic Measurement and Data Processing System for Ship Structural Vibration
This paper describes the construction of an automatic measurement and data processing system for ship structural vibration, together with the associated computer hardware and software. Used in place of traditional light-beam oscillographs and tape recorders for full-scale structural vibration tests, the system offers strong real-time analysis capability, flexible operation, and easy organization of the results into a database.
56.
Using the ATOS II 3D measurement system, the point-cloud processing software Surfacer, and the 3D modeling software UG NX3.0, and taking clay models of design renderings or existing physical objects as subjects, this study combines the characteristics of reverse engineering and industrial design to examine the inspection, analysis, and modeling problems of reverse-engineering rapid prototyping in industrial design. Reverse engineering has been widely applied to new product development, restoration of old parts, and product inspection; it not only digests and absorbs physical prototypes and clay models, but also allows them to be modified and redesigned to manufacture new products.
57.
Research on Data Exchange Technology between Databases in Geographic Information Systems
Given the huge volume of information exchanged in geographic information systems, a Data Exchange Communication Platform (DECP) is proposed as a middle layer to transfer and exchange data between heterogeneous databases. The DECP consists of seven parts: a data access layer, a business logic layer, a security agent, a data transfer agent, a client agent, a message layer, and a transaction manager. Its key technologies are agent-based security, system scheduling, access to heterogeneous databases, and transaction scheduling and management.
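A toy sketch of the middle-layer idea follows; all class and method names are hypothetical, since the abstract does not specify the DECP's actual interfaces. A security agent authorizes the client, and a transfer agent packs the record into a neutral message for the message layer, so heterogeneous databases never talk to each other directly.

```python
import json

class SecurityAgent:
    """Toy agent-based check: only registered clients may exchange data."""
    def __init__(self, allowed):
        self.allowed = set(allowed)
    def authorize(self, client):
        return client in self.allowed

class TransferAgent:
    """Serialize a record into a database-neutral message (JSON here)
    so heterogeneous databases can exchange it via the message layer."""
    def pack(self, source, record):
        return json.dumps({"source": source, "record": record})
    def unpack(self, message):
        return json.loads(message)

def exchange(client, record, security, transfer):
    # The middle layer: authorize first, then pack for the message layer.
    if not security.authorize(client):
        raise PermissionError(client)
    return transfer.pack(client, record)
```

The actual DECP also schedules transactions and system resources; this sketch covers only the authorize-then-pack path.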
58.
Yu Li. Automotive Digest (《汽车文摘》), 2020(3): 37-40
Systems within a vehicle manufacturer cannot be integrated precisely because their underlying code is closed and the logic of their data structures is unknown. This work applies big-data analysis and permutation of data structures to verify the underlying data logic of such closed systems and to manage their key data fields. It concludes that the key fields of an engineering data system carry not only business information but also kernel fields. Without access to the underlying encapsulated code, it establishes an effective verification and system development method that studies the underlying logical structure of data from the data itself, builds precise change-management rules for inter-system integration, and enables fast and accurate data transfer.
59.
A Data-Mining GIS and Its Application in Navigation
Based on the density-agglomeration idea in cluster analysis, a new composite clustering algorithm is proposed. The algorithm is then applied to data mining in a geographic information system and used for the automatic design of ship routes.
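The abstract does not detail the composite algorithm, but the underlying density-agglomeration idea can be sketched as a minimal density-based clustering in pure Python; the parameters `eps` and `min_pts` are illustrative assumptions.

```python
def density_cluster(points, eps=1.0, min_pts=3):
    """Minimal density-based clustering: a point with at least `min_pts`
    neighbours (itself included) within radius `eps` is dense; dense
    points closer than `eps` are agglomerated into the same cluster.
    Returns one label per point (-1 = noise)."""
    def near(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xj - xi) ** 2 + (yj - yi) ** 2 <= eps ** 2]
    labels = [-1] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] != -1 or len(near(i)) < min_pts:
            continue             # already clustered, or not dense
        labels[i] = cid
        frontier = near(i)
        while frontier:          # grow the cluster from dense points
            j = frontier.pop()
            if labels[j] == -1:
                labels[j] = cid
                if len(near(j)) >= min_pts:
                    frontier.extend(near(j))
        cid += 1
    return labels
```

For route design, clusters of historical ship positions could suggest frequently sailed corridors; that application layer is beyond this sketch.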
60.
Data Processing Methods for Station Carrying Capacity Calculation
Collecting complete and reasonable raw data is the basis for calculating the carrying capacity of a railway station. To counter the data distortion and reduced reliability caused by environmental and human factors, two measures are taken. First, the raw data are preprocessed using the weakening buffer principle of grey system theory: elements that deviate significantly from the raw data sequence are extracted, weakened with a second-order buffer operator, and substituted back for the extracted elements in the original sequence. Second, mathematical statistics is used to derive the weight coefficients in the formula for the occupation-time standard, replacing the simple arithmetic mean of the traditional formula. The concrete steps of the preprocessing method and of determining the weight coefficients are given. Processing actual data on the throat occupation time of trains received from the Chengdu direction onto track 2 of the arrival-departure yard at Chongqing station verifies that the method effectively mitigates the adverse effect of random disturbances on raw data collection and makes the determination of equipment occupation-time standards more reasonable.
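The buffer operator itself has a standard form in grey system theory: the first-order average weakening buffer operator replaces x(k) with the mean of x(k), ..., x(n), and the second-order operator applies it twice. A minimal sketch of that operator follows; the extraction of deviating elements and their substitution back into the sequence, which the paper combines with it, are omitted.

```python
def weaken(seq):
    """First-order average weakening buffer operator (AWBO) from grey
    system theory: x(k) is replaced by the mean of x(k), ..., x(n)."""
    n = len(seq)
    return [sum(seq[k:]) / (n - k) for k in range(n)]

def second_order_weaken(seq):
    """Second-order buffering: apply the operator twice, further damping
    strong random disturbances in the raw occupation-time data."""
    return weaken(weaken(seq))
```

Note that the last element is a fixed point of the operator, so the tail of the sequence anchors the buffered data, which is the intended behavior for weakening shocks earlier in the series.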