891.
Establishment of industry facilities often induces heavy vehicle traffic that exacerbates congestion and pavement deterioration in the neighboring highway network. When planning facility locations and land-use developments, it is important to take into account the routing of freight vehicles, the impact on public traffic, and the planning of pavement rehabilitation. This paper presents an integrated facility location model that simultaneously considers traffic routing under congestion and pavement rehabilitation under deterioration. The objective is to minimize the total cost of facility investment, transportation (including traffic delay), and pavement life-cycle maintenance. Building upon analytical results on optimal pavement rehabilitation, the problem is formulated as a bi-level mixed-integer non-linear program (MINLP), with facility location, freight shipment routing and pavement rehabilitation decisions in the upper level and traffic equilibrium in the lower level. The problem is then reformulated into an equivalent single-level MINLP based on Karush–Kuhn–Tucker (KKT) conditions and approximation by piecewise-linear functions. Numerical experiments on hypothetical and empirical network examples show the performance of the proposed algorithm and yield managerial insights.
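The piecewise-linear step works by replacing each nonlinear cost term with secant segments so the reformulated problem becomes mixed-integer linear. A minimal sketch of that idea applied to a BPR-style congestion delay curve (all function names and parameters here are illustrative, not taken from the paper):

```python
import numpy as np

def bpr_delay(v, t0=1.0, cap=100.0, alpha=0.15, beta=4):
    # Bureau of Public Roads congestion delay, standing in for the
    # model's nonlinear transportation cost term (illustrative parameters)
    return t0 * (1.0 + alpha * (v / cap) ** beta)

def secant_pwl(f, x_lo, x_hi, n_pieces):
    """Breakpoints, slopes and intercepts of a secant piecewise-linear
    approximation; for a convex f the secants upper-bound the function."""
    xs = np.linspace(x_lo, x_hi, n_pieces + 1)
    ys = f(xs)
    slopes = np.diff(ys) / np.diff(xs)
    intercepts = ys[:-1] - slopes * xs[:-1]
    return xs, slopes, intercepts

def eval_pwl(x, xs, slopes, intercepts):
    # locate the segment containing x and evaluate its affine piece
    i = np.clip(np.searchsorted(xs, x) - 1, 0, len(slopes) - 1)
    return slopes[i] * x + intercepts[i]

xs, m, b = secant_pwl(bpr_delay, 0.0, 200.0, 8)
approx, exact = eval_pwl(140.0, xs, m, b), bpr_delay(140.0)
```

With eight pieces the secant surrogate stays within a few percent of the true delay while being representable with linear constraints and binary segment selectors in a MILP.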
892.
The rapid growth of smartphones with embedded navigation systems such as GPS modules provides new ways of monitoring traffic. These devices can register and send a great amount of traffic-related data, which can be used for traffic state estimation. In that case, the amount of data collected depends on two variables: the penetration rate of devices in the traffic flow (P) and their data sampling frequency (z). Defining data composition as the way a given number of observations is collected, in terms of P and z, we need to understand the relation between the amount and composition of data collected and the accuracy achieved in traffic state estimation. This was accomplished through an in-depth analysis of two datasets of vehicle trajectories on freeways: the first consists of trajectories on a real freeway, while the second was obtained through microsimulation. Hypothetical scenarios of data sent by equipped vehicles were created based on the composition of the data collected; different values of P and z were used, and each unique combination defined a specific scenario. Traffic states were estimated through two simple methods and a more advanced one that incorporates traffic flow theory. A measure to quantify the data to be collected was proposed, based on travel time, number of vehicles, penetration rate and sampling frequency. The error was below 6% for every scenario in each dataset, and increasing the amount of data reduced variability in the data-count estimation. The performance of the different estimation methods varied across datasets and scenarios. Since the same number of observations can be gathered with different combinations of P and z, the effect of data composition was analyzed (a trade-off between penetration rate and sampling frequency). Different situations were found: in some, an increase in penetration rate is more effective at reducing estimation error than an increase in sampling frequency, considering an equal increase in observations; in others, the opposite holds. Between these regions lies an indifference curve, which is in fact the solution to the problem of minimizing the error given a fixed number of observations. As a general result, increasing sampling frequency (penetration rate) is more beneficial when the current sampling frequency (penetration rate) is low, independent of the penetration rate (sampling frequency).
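The P–z trade-off is easiest to see through a simple data-volume measure. The sketch below is an illustrative measure in the spirit of the one described (travel time, number of vehicles, penetration rate, sampling frequency), not the paper's exact formula:

```python
def expected_observations(n_vehicles, mean_travel_time_s, penetration, freq_hz):
    """Expected number of probe data points over an analysis period:
    each equipped vehicle reports freq_hz samples per second for the
    duration of its trip (illustrative measure, not the paper's formula)."""
    return n_vehicles * penetration * freq_hz * mean_travel_time_s

# the same data volume can arise from very different compositions,
# which is exactly why an indifference curve in (P, z) exists:
n_a = expected_observations(1000, 300.0, penetration=0.10, freq_hz=1.0)
n_b = expected_observations(1000, 300.0, penetration=0.20, freq_hz=0.5)
```

Both scenarios yield 30,000 observations, yet they need not produce the same estimation error, since they spread the samples across vehicles differently.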
893.
Active Traffic Management (ATM) systems have been emerging in recent years in the US and Europe. They provide control strategies to improve traffic flow and reduce congestion on freeways. This study investigates the feasibility of utilizing a Variable Speed Limits (VSL) system, one key part of ATM, to improve traffic safety on freeways. A proactive traffic safety improvement VSL control algorithm is proposed. First, an extension of the METANET macroscopic traffic flow model is employed to analyze the VSL system's impact on traffic flow. Then, a real-time crash risk evaluation model is estimated to quantify crash risk. Finally, optimal VSL control strategies are obtained by employing an optimization technique to minimize the total crash risk along the VSL implementation corridor. Constraints are set up to limit the increase in average travel time and the temporal and spatial differences between posted speed limits. This novel VSL control algorithm can proactively reduce crash risk and thereby improve traffic safety. The proposed algorithm is implemented and tested on a mountainous freeway bottleneck area in the micro-simulation software VISSIM. Safety impacts of the VSL system are quantified as crash risk improvements and speed homogeneity improvements. Moreover, three different driver compliance levels are modeled in VISSIM to assess the sensitivity of VSL effects to driver compliance. The results demonstrate that the proposed VSL system can improve traffic safety by decreasing crash risk and enhancing speed homogeneity under both the high and moderate compliance levels, while it fails to significantly enhance safety under the low compliance scenario. Finally, implementation suggestions for the VSL control strategies and related future research topics are discussed.
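The core optimization (minimize corridor crash risk subject to limits on spatial speed-limit differences) can be sketched as a search over discrete posted limits. The risk surrogate and all numbers below are toy assumptions; the paper uses a fitted real-time crash risk model instead:

```python
from itertools import product

SPEEDS = [60, 70, 80, 90, 100]   # candidate posted limits, km/h
FREE_SPEEDS = (95, 70, 90)       # prevailing speed per segment (toy data)

def crash_risk(limits):
    # toy surrogate: risk grows with the gap between the posted limit and
    # the prevailing speed on each segment (stand-in for the paper's
    # real-time crash risk evaluation model)
    return sum((v - u) ** 2 for v, u in zip(limits, FREE_SPEEDS))

def feasible(limits, max_step=10):
    # spatial constraint: consecutive gantries differ by at most 10 km/h
    return all(abs(a - b) <= max_step for a, b in zip(limits, limits[1:]))

best = min((l for l in product(SPEEDS, repeat=3) if feasible(l)),
           key=crash_risk)
```

The constraint forces the middle segment's limit up from its unconstrained optimum, illustrating how spatial smoothing trades local risk for a coherent speed profile along the corridor.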
894.
A variety of sensor technologies, such as loop detectors, traffic cameras, and radar, have been developed for real-time traffic monitoring at intersections; most are limited to providing link traffic information, and few are capable of detecting turning movements. Accurate real-time information on turning movement counts at signalized intersections is a critical requirement for applications such as adaptive traffic signal control. Several attempts have been made in the past to develop algorithms for inferring turning movements at intersections from entry and exit counts; however, the estimation quality of these algorithms varies considerably. This paper introduces a method to improve the accuracy and robustness of turning movement estimation at signalized intersections. The new algorithm makes use of signal phase status to minimize the underlying estimation ambiguity. A case study based on turning movement data from a four-leg signalized intersection was conducted to evaluate the performance of the proposed method and compare it with two other well-known estimation methods. The results show that the algorithm is accurate, robust and fairly straightforward to implement in the real world.
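The baseline inference task, before the paper's signal-phase refinement, is to recover the turning-proportion matrix linking entry counts to exit counts over many intervals. A minimal least-squares sketch with synthetic, noiseless data (the proportion matrix and counts are invented for illustration; the paper's phase-aware algorithm is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical "true" turning proportions: b[i, j] is the share of
# vehicles entering on leg i that exit on leg j (rows sum to 1)
B_TRUE = np.array([[0.0, 0.6, 0.3, 0.1],
                   [0.2, 0.0, 0.5, 0.3],
                   [0.4, 0.3, 0.0, 0.3],
                   [0.3, 0.4, 0.3, 0.0]])

# entry counts over 200 signal cycles; exits follow the conservation
# relation exits = entries @ B (noiseless to keep the sketch short)
entries = rng.uniform(50, 200, size=(200, 4))
exits = entries @ B_TRUE

# stack the cycles and recover B by least squares
B_HAT = np.linalg.lstsq(entries, exits, rcond=None)[0]
```

A single interval is underdetermined (4 equations, 12 unknowns), which is the "estimation ambiguity" the paper targets; stacking intervals, and conditioning on which movements each signal phase permits, is what shrinks it.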
895.
Acoustic-based mix design is still far from achieving a clear and accepted rationale. The three main domains that affect pavement acoustic performance (generation, absorption, propagation) involve a number of acoustic parameters, whose relationship with pavement properties is scarcely or insufficiently known. In more detail, the parameters that define the acoustic coupling between the two phases comprising a porous material are porosity, resistivity, tortuosity, and viscous and thermal factors. Consequently, the spectrum of a pavement's absorption coefficient depends, in particular, on tortuosity, whose relationship with HMA (hot mix asphalt) bulk properties is still an open issue. Given that, the study described in this paper aimed at: (i) assessing the effect of tortuosity on the absorption coefficient of a pavement layer; (ii) assessing the dependence of tortuosity on mix design parameters and/or mix properties; (iii) deriving a straightforward algorithm to estimate the effect of tortuosity-related properties on the absorption coefficient. Based on the above issues, an experimental plan was designed and carried out to study these relationships and set out a tentative theoretical and practical framework. The relationships between acoustic and traditional bulk properties of pavement mixtures were analysed. Acoustic models and hydraulic analogies were considered and, based on them, relationships were formalised and submitted to experimental validation. A simple relationship was derived for obtaining tortuosity from nominal maximum aggregate size and thickness, and was then used to derive the frequency of the first peak of the absorption spectrum from HMA properties. Nominal maximum aggregate size and lift thickness emerged as the key factors shaping the peak frequency. Future research will address a number of issues, among which: the synergetic assessment of the influence of HMA properties on the absorption coefficient over the entire spectrum, and the synergetic consideration of generation and absorption factors. Practical benefits and outcomes are expected for both practitioners and researchers.
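The link between tortuosity, layer thickness and the first absorption peak follows from a standard acoustic result: sound inside the pores of a rigid-backed layer travels at roughly c/sqrt(tortuosity), so the quarter-wavelength resonance sits near c/(4·d·sqrt(tortuosity)). The sketch below uses that textbook relation for illustration; the paper's own fitted relationship (tortuosity from nominal maximum aggregate size and thickness) may differ:

```python
def first_peak_frequency(thickness_m, tortuosity, c=343.0):
    """Quarter-wavelength resonance of a rigid-backed porous layer:
    the in-pore sound speed is slowed by sqrt(tortuosity), lowering the
    first absorption peak (standard relation, not the paper's formula)."""
    return c / (4.0 * thickness_m * tortuosity ** 0.5)

f_thin = first_peak_frequency(0.04, 2.0)   # 4 cm porous layer
f_thick = first_peak_frequency(0.06, 2.0)  # thicker lift -> lower peak
```

Both a thicker lift and a higher tortuosity push the peak toward lower frequencies, consistent with thickness and aggregate size emerging as the key factors shaping the peak.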
896.
Development of strategies to control urban air pollution is a complex process involving a wide range of sciences. In this study a system dynamics model is proposed to estimate the behavior of the parameters affecting air pollution in Tehran. The proposed model includes two subsystems: (1) urban transportation and (2) air-polluting industries. Several policies to mitigate air pollution are proposed, and the model is simulated under several scenarios using historical data from Tehran's transportation and industrial sectors. The policies fall into four categories: (1) road construction, (2) technology improvement in the fuel and automotive industries, (3) traffic control plans, and (4) development of public transportation infrastructure. The results show the effectiveness of the proposed policies; in this case, technology improvement in the fuel and automotive industries and development of public transportation infrastructure are the more effective policies for reducing air pollution.
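A system dynamics model of this kind boils down to stocks updated by flows at each time step. A minimal stock-and-flow sketch of the transport subsystem, with all rates invented for illustration (not calibrated to Tehran data or to the paper's model structure):

```python
def simulate_emissions(years=10, fleet0=3.0e6, growth=0.05, scrappage=0.03,
                       base_emission=1.0, clean_tech_rate=0.0):
    """Toy stock-and-flow model: the vehicle fleet is a stock fed by a
    growth inflow and drained by scrappage; per-vehicle emissions decline
    as cleaner technology penetrates. All rates are illustrative."""
    fleet, emissions = fleet0, []
    for t in range(years):
        fleet += fleet * (growth - scrappage)            # net inflow to the stock
        factor = base_emission * (1.0 - clean_tech_rate) ** (t + 1)
        emissions.append(fleet * factor)
    return emissions

baseline = simulate_emissions()
policy = simulate_emissions(clean_tech_rate=0.04)  # technology-improvement scenario
```

Even in this toy version, a modest annual technology improvement outpaces fleet growth over the horizon, mirroring the kind of scenario comparison the study performs.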
897.
In 2008 the regional government of Catalonia (Spain) reduced the maximum speed limit on several stretches of congested urban motorway in the Barcelona metropolitan area to 80 km/h, while in 2009 it introduced a variable speed system on other stretches of its metropolitan motorways. We use the differences-in-differences method, which enables a policy impact to be measured under specific conditions, to assess the impact of these policies on emissions of NOx and PM10. Empirical estimates indicate that reducing the speed limit to 80 km/h caused a 1.7–3.2% increase in NOx and a 5.3–5.9% increase in PM10. By contrast, the variable speed policy reduced NOx and PM10 pollution by 7.7–17.1% and 14.5–17.3%, respectively. As such, a variable speed policy appears to be a more effective environmental policy than reducing the speed limit to a fixed maximum of 80 km/h.
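The differences-in-differences logic is compact enough to state directly: the change on treated stretches minus the change on comparable untreated stretches, with the residual attributed to the policy. The numbers below are hypothetical, not the paper's data:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Canonical 2x2 differences-in-differences estimator: the change on
    treated motorway stretches minus the change on control stretches."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# hypothetical mean NOx indices: pollution rose everywhere over the
# period, but rose 2 points more where the 80 km/h limit applied
effect = diff_in_diff(100.0, 104.0, 100.0, 102.0)
```

Subtracting the control trend is what lets the method isolate the policy effect from metropolitan-wide changes in traffic and weather over the same years.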
898.
This paper analyses the driving cycles of a fleet of vehicles with predetermined urban itineraries. Most driving cycles developed for this type of vehicle do not properly address variability among itineraries. Here we develop a polygonal driving cycle that assesses each group of related routes based on microscopic parameters. The method measures the kinematic cycles of the routes traveled by the vehicle fleet, segments them into micro-cycles and characterizes their properties, groups the micro-cycles into clusters with homogeneous kinematic characteristics, and constructs a standard cycle for each cluster. The process is applied to public bus operations in Madrid.
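The first two steps, segmenting a speed trace into micro-cycles and extracting the per-cycle descriptors that feed the clustering, can be sketched as follows (stop threshold and feature choice are assumptions for illustration, not the paper's exact definitions):

```python
import numpy as np

def split_microcycles(speed, stop_thresh=0.5):
    """Segment a 1 Hz speed trace (m/s) into micro-cycles, each running
    from the end of one stop to the end of the next."""
    stopped = speed < stop_thresh
    starts = np.flatnonzero(stopped[:-1] & ~stopped[1:]) + 1  # stop -> motion
    return [c for c in np.split(speed, starts) if (c >= stop_thresh).any()]

def microcycle_features(cycle, stop_thresh=0.5):
    # per-cycle kinematic descriptors that would feed the clustering step
    moving = cycle[cycle >= stop_thresh]
    return {"duration_s": int(len(cycle)),
            "mean_moving_speed": float(moving.mean())}

trace = np.array([0, 0, 5, 10, 5, 0, 0, 8, 12, 0], dtype=float)
cycles = split_microcycles(trace)
```

Clustering these feature vectors (e.g. by k-means) and concatenating a representative micro-cycle from each cluster yields the standard cycle for a route group.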
899.
An aggregate air traffic flow model based on a multicommodity network is used for traffic flow management in the National Airspace System of the United States. The problem of minimizing the total travel time of flights, subject to sector capacity constraints, is formulated as an Integer Program whose solution achieves optimal delay control. For the scenarios investigated, the Integer Program has billions of variables and constraints, so it is relaxed to a Linear Program for computational efficiency. A dual decomposition method is applied to solve the large-scale Linear Program in a computationally tractable manner, and a rounding algorithm is developed to map the Linear Program solution to a physically acceptable result. The method is implemented for the entire continental United States, and a 2-h traffic flow management problem is solved with it.
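Dual decomposition works by pricing the coupling capacity constraints so that each commodity's subproblem decouples, then adjusting the prices by projected subgradient steps. A toy two-route instance (not the NAS multicommodity model; all numbers illustrative):

```python
def route_choice(price, t_fast=1.0, t_slow=2.0, n_flights=2):
    """Dual subproblem: with the capacity constraint dualized, each flight
    independently takes the fast (capacitated) route iff its travel time
    plus the congestion price beats the slow route."""
    return n_flights if t_fast + price < t_slow else 0

def dual_ascent(capacity=1, step=0.25, iters=50):
    # projected subgradient update on the sector-capacity multiplier;
    # a fixed step keeps the sketch short (practice uses shrinking steps)
    price = 0.0
    for _ in range(iters):
        usage = route_choice(price)
        price = max(0.0, price + step * (usage - capacity))
    return price

congestion_price = dual_ascent()
```

The multiplier settles around the 1.0 time difference between routes, the price at which demand for the capacitated sector just balances its capacity; a rounding step would then restore integral flight counts.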
900.
Trajectories drawn in a common reference system by all the vehicles on a road are the ultimate empirical data for investigating traffic dynamics. The vast amount of such data made freely available by the Next Generation SIMulation (NGSIM) program is therefore opening up new horizons in the study of traffic flow theory. Yet the quality of trajectory data and its impact on the reliability of related studies were vastly underestimated problems in the traffic literature even before the availability of NGSIM data. The absence of established methods to assess data accuracy, and even of a common understanding of the problem, makes it hard to speak of reproducibility of experiments and objective comparison of results, particularly in a research field where the complexity of human behaviour is an intrinsic challenge to the scientific method. This paper therefore sets out to design quantitative methods for inspecting trajectory data. To this aim, the structure of the error in point measurements and its propagation along the travelled space are first investigated. Analytical evidence of the bias propagated into the vehicle trajectory functions is given, together with a related consistency requirement. The literature on estimation/filtering techniques is then reviewed in light of this requirement, and a number of error statistics suitable for inspecting trajectory data are proposed. The designed methodology, involving jerk analysis, consistency analysis and spectral analysis, is then applied to the complete set of NGSIM databases.
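Jerk analysis exploits the fact that differentiating position three times amplifies point noise enormously, so physically implausible jerk values flag measurement error. A minimal sketch (the 15 m/s³ threshold is an illustrative bound, not the paper's value):

```python
import numpy as np

def jerk(position, dt=0.1):
    # third finite difference of position: very sensitive to point noise,
    # which makes it a useful screening statistic for trajectory quality
    return np.diff(position, n=3) / dt ** 3

def implausible_fraction(position, dt=0.1, jerk_max=15.0):
    """Share of samples whose implied jerk exceeds a physical bound
    (15 m/s^3 is an assumed threshold for illustration)."""
    j = jerk(position, dt)
    return float(np.mean(np.abs(j) > jerk_max))

t = np.arange(0.0, 5.0, 0.1)
clean = t ** 2                  # smooth constant-acceleration trajectory
noisy = clean.copy()
noisy[25] += 1.0                # a single 1 m position spike
```

A lone 1 m spike at 10 Hz implies jerk on the order of thousands of m/s³, which is why raw video-extracted trajectories need filtering before differentiated quantities such as acceleration can be trusted.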