
THIRD MEDIUM-TERM (2002-2004) RESEARCH PLAN FOR ALADIN


INTRODUCTION

This third research plan is based on the conclusions of the four working groups of the 10th ALADIN workshop (7-8 June 2001), on the main orientations discussed during the ALATNET training course on data assimilation (11-22 June 2001), on deported contributions, and finally on the reports of the twelve working groups that met in Toulouse to prepare this document (10-14 December 2001). Most of the partners were involved in these discussions, which covered the following topics : training, maintenance, operations, applications, verification, coupling, dynamics, physics, "upperair" data assimilation - methods, "upperair" data assimilation - observations, "surface" analysis, predictability, and the organization of research. Experience from the two previous plans and the diverging demands from managers and scientists led to a mixed description of research topics : first presenting the main objectives with their relative importance and scheduling, then describing more precisely the "why" and "how". The parts dealing with the environment of research are of course handled differently. The evaluation of the second research plan will be presented as a separate document, available later.

The ALADIN project has reached an age at which it may be (and often is) considered as mature. The organization of research must evolve consistently, with increasing responsibilities for each partner, a larger share of local research, and more intensive networking (efficient distance tutoring, more e-mail exchanges, and frequent meetings of small, highly specialized working groups). This is necessary if we want to reach the main goals of the new research plan, i.e. marching towards very high resolution and continuous data assimilation while preserving, if not improving, the current level of response to operational problems. And this is "de facto" imposed by the sharp cut in the funding that has so far supported research stays in Toulouse, and by the uncertain future of RC-LACE.

TRAINING

Basic training on ALADIN is now performed in Toulouse for new partners, and at home wherever a (pre-) operational ALADIN suite is running. This was a major step forward achieved over the last years, and it must be preserved.

Advanced training, including the first steps in research, is on a trickier path. This task was traditionally assigned to the Toulouse and Prague centres, and more recently to the ALATNET training courses. But the corresponding support is likely to be drastically reduced, and the last (and third) ALATNET seminar will be organized in May 2002. The following palliative measures are considered, all necessary and none sufficient on its own to cope with the challenge :

  • an increased and joint effort from all partners to support centralized training ; the current target length of stays, 1.5 to 2 months, must be kept;

  • preserving long stays in Toulouse for maintenance and training, through an improved organization;

  • more advertising on affordable international schools on NWP topics;

  • a more complete and structured documentation;

  • a local implementation of the 1d and 2d versions of the ALADIN model by each partner, and "free" time left to use them in order to complete training; this is also true for research.

PhD theses have proved their importance over the last years for the emergence of local research and training. Almost all partners are now aware of their usefulness. This effort must go on and become an "across the board" commitment of the ALADIN community.

MAINTENANCE

Maintenance covers phasing (updating the model to take into account the latest technical or scientific developments and to keep consistency with ARPEGE), optimization, documentation, and a few other technical tasks of common interest. It is essential for the life and progress of the ALADIN project, especially now, when facing ambitious research objectives, a new organization of research and increasing discrepancies between local computational resources (in power or architecture). The previous rules for an effective and well balanced contribution from all partners remain valid, even though there has been one successful experience of deported phasing over the last years. However some emerging problems need to be underlined.

The increasing share of data assimilation in research and operations calls for reconsidering the organization and schedule of the phasing teams. The delay between ARPEGE and ALADIN updates should be reduced to avoid too large drifts. And more experts on data assimilation must be trained to allow some turn-over.

The very large discrepancies between local versions in the present situation are really worrying, with a lag of 6 full cycles between the oldest and the most recent ones. This is a penalty both for operations, since "back-phasing" of the latest improvements gets more and more difficult if not impossible, and for research, for lack of a common reference. The solution to this problem relies on the following actions :

  • very strict deadlines for the update of local versions will be set; most ALADIN teams are now quite experienced in porting, and distance help is often available; the main problems may be related to data assimilation;

  • more care must be devoted to "export" versions by the Toulouse team : rationalization of updates and a detailed documentation of the reference and of each deviation, concerning not only the scientific content, but also the link to operational changes, the practical aspects (portability, changes in namelists, ...) and the correspondence with available "views" in the source code manager (clearcase);

  • no "back-phasing" on export versions;

  • more frequent update of the default values of namelist parameters, consistently with operational changes;

  • definition of a reference framework for research, and more advertising on the basic rules to be applied when moving to a new configuration (higher resolution, new domain extension, linear grid, ...); this must be put on the ALADIN web site.

Portability and low computational cost are two major assets of ALADIN, and require sustained attention. The present widening of the panel of architectures for NWP-devoted computers (starting from the ARPEGE/IFS level !) may easily lead to unexpected drifts, especially in memory requirements. A watch task is to be implemented within phasing, and the design of interfaces or coding methods should take this into account. A documentation on the data flow in the model is required, since it was significantly modified over the latest cycles. This is necessary for both developers and phasers, and would help controlling the cost of the model.

The maintenance of scripts is a new issue, mainly resulting from the emergence of data assimilation in operations : blending, observation management, verification tools based on optimal interpolation, ... This could be achieved via a management of scripts (including the associated namelists and pieces of code) within clearcase, but requires a significant initial effort to ensure portability.

To end with, ALADIN should take an active part in the new EWGLAM/SRNWP initiative for the design of common interfaces and procedures for NWP limited-area models.

OPERATIONS

The issues addressed here are obviously of a more continuous type than those linked to a specific scientific item. Hence the planning is not broken down here in terms of priority and/or manpower, but rather in terms of aims of a permanent nature that are to be fulfilled in any case.

The problems of non-ascending compatibility of the code (impossibility to recreate the old status from the new one simply through namelist changes) are detrimental to a good validation, but it is recognised that they are quasi unavoidable for some bug corrections and important cleaning efforts. It is thus simply recommended to avoid unnecessary use of this facility, but no strict ban is put on it. A more regular update of local versions might help a bit in this direction. In a similar line of thought, it would be useful to set up deadlines beyond which central maintenance of old ALADIN cycles will no longer be ensured. This should help curb the unwelcome current trend towards an increasing spread of the operationally used source codes.

Météo-France is currently running 4 networks per day for ARPEGE and ALADIN-France with the following ranges and "short" cut-off windows (for ARPEGE assimilation) :

Network           00 UTC     06 UTC             12 UTC     18 UTC
ARPEGE            96 h       42 h (soon 54?)    72 h       30 h (soon 54?)
  (cut-off)       (1h50)     (3h00)             (1h50)     (3h00)
ALADIN-France     48 h       42 h               36 h       30 h

The cut-off rules for the production are very unlikely to change for the forthcoming two years. For data assimilation purposes, the "long" cut-off windows are all below 12 h. Presently, Météo-France is the only partner producing 4 ALADIN forecasts a day (and also forecasts shorter than 48 h). After more than one year of experience, the main interest of having four rather than two prediction networks per day appears to be on the predictability side : in case of critical and/or uncertain situations the ensemble of "verifying" forecasts valid at the same time is twice as rich as previously and helps to get a first probabilistic look at short-range forecasting. However it implies changes in the organization of forecasting. In case other partners wish to go in this direction, modifications of the cut-off rules (scheduling the availability of coupling files) should receive more attention at Météo-France. Similarly a backup solution should be defined for cases of delay or failure of one of the short cut-off forecasts.

In case of requests for ALADIN coupling files beyond 48 hours, additional coupling files, and if necessary ARPEGE files, could be produced for that sole purpose. But a minimum of coordination is required for applications sharing the same coupling files or in the case of embedded models. It has to be recalled here that the problem of the coupling frequency or method for remote applications is still a very open one from the scientific point of view. For operations, choices are strongly influenced by the capacities of the transmission network.

To anticipate such changes, it is necessary to have from now a clear view of the scheduling of some typical ALADIN operational suites : ALADIN-France (4 networks and coupling), ALADIN-LACE (blending and coupling), ALBACHIR (data assimilation), one SELAM ALADIN suite (lagged coupling), ALADIN-Belgium (blending and data assimilation in a nested model) at least. Others are welcome. For the short range this could also help adjusting the starting point of requests for coupling files.

The transmission of CMA-type observation files (or their ODB equivalent) must disappear from the ALADIN operational networking scheme, as soon as possible. There are indeed too many associated incompatibilities with the international rules on raw data exchanges. Partners should be now able to build and manage their own observations database, especially those running an assimilation suite.

The harmonisation between the 4D-Var data assimilation of ARPEGE and the ALADIN cycling can presently be handled in two ways: (1) blending, as in the ALADIN-LACE application, and (2) data assimilation with a guess produced by ALADIN forecasts coupled to ARPEGE analysed states at both ends of the 6-hour ranges (ALBACHIR solution). One could also rename these two options "spectral" and "lateral gridpoint" blending. Given this wealth of opportunities to avoid too big discrepancies with the ARPEGE cycle, it is recommended not to use any system where the ARPEGE influence is limited to the lateral boundary forcing of the 6 h forecast. Open problems remain in the choice of a time-consistent versus space-consistent initial state for the coupling files and in the choice of standard versus incremental initialization for the production runs. In both cases the solutions may depend on whether an analysis is performed or not.
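To fix ideas on the "spectral blending" option, here is a minimal sketch of the underlying principle: keep the large scales of the driving-model analysis and the small scales of the LAM guess in spectral space. The function and variable names are invented, both fields are assumed to lie on the same grid, and the operational dfi-blending algorithm is more elaborate than this low-pass combination.

```python
import numpy as np

def spectral_blend(aladin_guess, arpege_analysis, k_cut):
    """Blend two 2-D fields in spectral space: total wavenumbers below
    k_cut are taken from the driving-model analysis, the rest from the
    LAM guess.  Illustrative sketch only."""
    g_hat = np.fft.fft2(aladin_guess)      # LAM guess spectrum
    a_hat = np.fft.fft2(arpege_analysis)   # large-scale analysis spectrum
    ny, nx = aladin_guess.shape
    ky = np.fft.fftfreq(ny) * ny           # integer wavenumbers
    kx = np.fft.fftfreq(nx) * nx
    ktot = np.sqrt(ky[:, None] ** 2 + kx[None, :] ** 2)
    blended_hat = np.where(ktot < k_cut, a_hat, g_hat)
    return np.real(np.fft.ifft2(blended_hat))
```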

To end with, it has to be mentioned that the move to a "linear grid" configuration (i.e. an increase of the sole spectral resolution) in an operational application may lead to some noise problems for the embedded ALADIN models, since a "quadratic grid" configuration will be kept for the intermediate transmission domains. Known solutions are a retuning of initialization or the implementation of blending, but this issue is still being worked on.

APPLICATIONS

New fields for post-processing

Thanks to recent changes in the Full-Pos code, the introduction of new post-processed fields is now far easier. The demand for new fields and the corresponding requirements have to be expressed by users or by the scientific community.

Newly introduced fields concern : the duration of precipitation, the altitude of the θ'w = 0 °C level, and the altitude of the T = -10 °C level. Fields discussed for introduction in the near future are : probabilistic aspects of precipitation, brightness temperature and radiances.

For the time being there is no possibility of using post-processing in some configurations of the model, such as TL/AD. Introducing this option would be very helpful for research, especially concerning variational assimilation.

Available tools for operations and research

Preliminary efforts have been made towards a coordinated exchange of applications. There is a rather raw and outdated list of available applications. This list must be updated and put on the ALADIN web site with all the details. On top of that, an ftp account could be made available for the exchange of applications. Another solution would be to manage the main tools under clearcase, as is also proposed for scripts. This would require a significant initial work, to update codes and make them portable, and some more work at each phasing, but it would save time for each team and might even avoid some bugs (such as those induced by the lack of consistency between model and tools).

Interface to downstream applications

ALADIN products are used in a variety of applications (dispersion, snow, wave, hydrological models, etc.). All these applications are rather specific and used only in particular centres. However it would be preferable to have the interfaces registered, in order to avoid duplication of this work whenever possible.

Statistical adaptation

Different statistical adaptation methods are used in some ALADIN countries. They generally have a very positive impact on the verification scores, but they are again rather specific and not easily transferable between ALADIN partners. There should nevertheless be a transfer of knowledge and experience between ALADIN partners about the use of statistical methods and the results obtained, mainly via the ALADIN Newsletters and more involvement in the corresponding SRNWP networking.
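As an illustration of the kind of statistical adaptation referred to here, a minimal least-squares (MOS-type) sketch for one station and one forecast range; the names are invented and operational schemes naturally use richer predictor sets and stratifications.

```python
import numpy as np

def fit_mos(predictors, observations):
    """Fit a linear statistical adaptation by least squares.
    predictors: (n_cases, n_pred) raw model outputs at the station,
    observations: (n_cases,) verifying observations."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    coeffs, *_ = np.linalg.lstsq(X, observations, rcond=None)
    return coeffs

def apply_mos(coeffs, predictors):
    """Apply the fitted coefficients to new model outputs."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ coeffs
```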

Dynamical adaptation

Dynamical adaptation is used in a few countries. Where it is used, the general opinion about its benefits for wind forecasts is positive, but further tuning might be necessary. On top of that, a systematic verification of the results of dynamical adaptation is missing.

Moreover, research on this topic must go on. Full-physics dynamical adaptation experiments have revealed a divergence of some forecast fields when going towards fine-scale (i.e. 2.5 km) resolutions. Precipitation and vertical velocity are the most affected fields. This behaviour is observed identically for hydrostatic and non-hydrostatic models. It has for the moment received no satisfactory explanation, and must be studied in more detail, trying first to separate the respective impact of dynamics and of the individual physical processes.

Model to satellite approach

A "model to satellite" application has just been designed. Although the computation method of radiances and the brightness temperature fields is very expensive, the results are very promising. This can be for example the right way towards new alternative verification methods.

Diag-Pack

This application consists of hourly analyses of surface fields (to get as close as possible to the most recent observations) and a set of indices to help the forecaster in the very short term prediction of extreme events. It is now implemented in several centres. However only the first part is effectively used by forecasters. Some more explanations about the diagnostic fields appear necessary. And of course the research work on this topic must be pursued (new fields, impact of orography, ...).

VERIFICATION

Introduction

This point is not strictly a research topic, but routine verification, including comparison to observations, to other ALADIN models and to other forecasting systems, is crucial to track deficiencies and steer further developments. However, past attempts to build even a very basic coordinated verification project failed, while the interest of the international community in this topic keeps growing, especially with the emergence of high resolution modelling. To avoid further drift, some basic guidelines are defined below, and the partners are asked to mandate a specific project leader.

Main objectives

Aim : Building a coordinated procedure for objective verification at synoptic scales (importance : HIGH)
First target dates and required effort :
  · Definition of rules, implementation of the database : spring 2002 ; 3 p.m. + 1 p.w. per partner, once
  · Routine update of the database : from spring 2002 ; 1 p.w. + 2 p.d. per partner, per month
  · Improvements and diffusion of results : from spring 2002 ; 2 p.m. per year

Aim : Defining a verification procedure for high resolution forecasts (importance : HIGH)
First target dates and required effort :
  · Definition of a working group, involving modellers and forecasters : spring 2002
  · Use of satellite and radar data (precipitations) : first applications to subjective verification end 2002
  · Safe exchange of local observations between ALADIN partners : starting mid 2002

Units are : p.m. : person × month, p.w. : person × week, p.d. : person × day. For research on mesoscale verification it is more difficult to quantify the required manpower (1 p.m. per partner per year ?) and the schedule, which is strongly linked to progress in data assimilation and to political willingness. European partners should at least send representatives to the SRNWP workshops on "verification methods", organized every two years by HIRLAM / KNMI (next in 2003).

Background

The situation regarding the verification of model forecasts even by the means of conventional scores (typically against SYNOP and TEMP observations) is far from being satisfactory within the ALADIN countries. At many places the traditional verification is done only partially, for example only against the stations of the country, or not continuously in time, or the scores are simply computed and not really evaluated, nor exchanged. The tools employed to compute the departure between the model forecasts and observations are also different from one country to another.

It is clear that the situation of the conventional scores must be put in order first, before moving to more sophisticated verification using, for example, imagery data. Achieving this is not easy, since the verification projects launched in the past have always failed. Several reasons contributed to this : lack of manpower and of specific control teams, hardly any exchange of local observational data between neighbouring partners, no standard verification procedure defined (though not really necessary at this stage), and probably also a lack of motivation. Verification is then done either by the modellers themselves (a mostly unpopular job) or by the forecasters (mostly subjectively) when they have some time left.

On the other hand, model verification is very important and must not be neglected any further. A common verification project would help tracking problems in operational suites (when looking at areas covered by several applications) or checking the impact of increased resolution, for nested models.

Subjective verification and case studies are not explicitly mentioned here, as they are not on such a critical path.

Objective verification at the synoptic scale

The proposed program is the following :

· creation of local observations databases (this should be already done by most partners)

· update of the list of contact points for verification (done recently)

· definition of a list of reliable SYNOP and TEMP stations, available to all partners and of known quality, covering all the operational ALADIN domains (each partner is responsible for its national observational network; some countries already sent their list)

¨ definition of a procedure to extract quality flags for these observations, at Météo-France, from operational monitoring and assimilation

¨ definition of the list of parameters to be controlled and a suitable format to exchange and store data; definition of the architecture of the database and development of management tools

· effective exchange of information, by monthly mails, and associated update of the verification database : forecasted fields at each observation point are required, obtained from the initial or post-processing grid according to what is in practice available to forecasters

¨ regular, daily, update of the database with observations and quality flags at Météo-France

¨ update of the database with the corresponding fields for ARPEGE, as a reference covering all domains, at Météo-France

¨ regular computation and inter-comparison of scores

· introduction of new parameters or observations

The tasks marked with "·" are under the responsibility of each partner; those marked with "¨" will mainly involve a small working group (first meeting early March 2002), the French verification team and maintenance stays in Toulouse.
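To fix ideas, the conventional scores to be stored and exchanged in such a database are of the following kind. This is a minimal sketch with invented names, not the actual verification software; the forecast values are assumed to have already been interpolated to the agreed list of reliable stations.

```python
import numpy as np

def conventional_scores(forecast_at_stations, observations):
    """Bias and RMSE of one forecast parameter against point observations
    over the agreed station list (both inputs are 1-D arrays)."""
    diff = np.asarray(forecast_at_stations) - np.asarray(observations)
    bias = diff.mean()
    rmse = np.sqrt((diff ** 2).mean())
    return bias, rmse
```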

Towards mesoscale verification

The classical verification scores, computed against the public SYNOP and TEMP observations, do not provide a satisfactory response to validate the mesoscale forecasts, just some indication about the robustness of the model. The problem is the insufficient density of these conventional observations.

Imagery data, from satellite or radar, can be used to complement the traditional scores. A "model to satellite" tool is already available and should be adapted to subjective, then objective, verification. The work on radar data could start in 2002 through closer cooperation with the HIRLAM group.

The design of a new methodology is also of great importance and cannot be successful without a more intense international cooperation on this topic : within the ALADIN partnership (a small working group should emerge from the next ALADIN workshop), with the French MESO project, within EWGLAM/SRNWP initiatives.

The exchange of all the available local data (additional SYNOP-type stations, rain-gauge hydrological stations, etc.) between the ALADIN countries would be of great help for verification (and also data assimilation) purposes, in order to get denser mesoscale networks over larger territories. For the needs of mesoscale modelling, continuous political and technical efforts are necessary to get around the present obstacles while preserving safety.

It was stressed that the effort on verification has a lot in common with the one on data assimilation. It would therefore be very useful to achieve a synergy between these two goals. A few examples of common points : management of an observations database, computation of the forecast departure from the observations, quality control of data, removal of systematic errors from the observed data, density of data networks, and so on.

COUPLING

Introduction

This topic was not so much considered in the previous research plan. However, severe operational forecast failures (when ALADIN missed rapidly moving storms correctly predicted by ARPEGE), variational sensitivity studies (preparing the management of lateral boundary conditions within a 4d-var) and case studies at very high resolution demonstrated that the present situation is not safe. The search for new solutions has already started and deserves a significant effort.

Main objectives

Each problem is listed below with the corresponding actions; priority and required effort are given in parentheses.

Interaction with orography :
  · "Surface-pressure tendency" coupling (HIGH, 6 months)
  · "Orography" coupling (LOW, 4 months)

Spectral coupling (MEDIUM, 18 months) :
  · Use of large-scale spectral information
  · Combination with Davies' scheme
  · Case studies and tuning

Time-interpolation (MEDIUM, 6 months) :
  · Further investigation / comparison of the present schemes, including the interpolation of amplitude and phase-angle

New prognostic variables (HIGH, 1 month) :
  · Design of a strategy for new variables from physics ; introduction whenever required

Towards higher resolution :
  · Pseudo-radiative scheme (LOW, 6 months)
  · Two-way nesting (LOW, 18 months)
  · Non-hydrostatic variables (LOW, 2 months)
  · Case studies (MEDIUM, 12 months)

Choices for data assimilation (LOW) :
  · Comparison of present choices
  · Spectral coupling and 4d-var

Interaction with orography

The problem of the generation of noise (in the form of a constant spurious source of gravity waves) through the coupling process is to be addressed. If the orography of the coupling and coupled models is not the same, the flow updated by large-scale fields at the lateral boundaries of the coupled model will always have to adapt to the small-scale orography, thus potentially sustaining a permanent gravity-wave source. This is especially true for the surface pressure field, which strongly reflects the orography.

Coupling the tendency of surface pressure rather than surface-pressure itself should be a solution, since the main part of the coupling imbalance is in the surface-pressure field. Updating only the surface-pressure tendency is assumed to better preserve the small-scale balanced structure of the LAM surface-pressure field. The first developments and some basic tests have been performed. However a deeper analysis of the impact is necessary and the scheme may require finer tuning.
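For reference, the standard Davies relaxation applied in the coupling zone, and the surface-pressure-tendency variant sketched above, may be written as follows; the notation is illustrative and not taken from the ALADIN documentation.

```latex
% Davies relaxation of a prognostic field x in the coupling zone
x^{t+\Delta t} = (1-\alpha)\, x^{t+\Delta t}_{\mathrm{LAM}}
               + \alpha\, x^{t+\Delta t}_{\mathrm{LS}},
\qquad 0 \le \alpha \le 1 ,
% coupling the surface-pressure tendency instead of the field itself
\pi_s^{t+\Delta t} = \pi_s^{t}
   + (1-\alpha)\bigl(\pi^{t+\Delta t}_{s,\mathrm{LAM}} - \pi^{t}_{s,\mathrm{LAM}}\bigr)
   + \alpha\bigl(\pi^{t+\Delta t}_{s,\mathrm{LS}} - \pi^{t}_{s,\mathrm{LS}}\bigr)
```

where α increases from 0 at the inner edge of the coupling zone to 1 at the lateral boundary, "LAM" denotes the coupled model and "LS" the large-scale (coupling) model.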

Another issue is the so-called orography coupling. The idea is to alleviate this drawback by progressively replacing the initial small-scale orography by the one of the coupling model when approaching the lateral boundaries (within the coupling zone). This might help for other prognostic variables, but is rather tricky and requires a strong coordination with the other developments concerning orography (required by dynamics or physics).

Spectral coupling

The failures in forecasting the "1999 Christmas storms", common to several limited area models in Europe, raised the problem of the relevance of the Davies relaxation scheme. Moreover the ALADIN coupling files contain the whole "large-scale" spectral data, and it seems attractive to use all the information available over the domain rather than only gridpoint values inside the small coupling zone. A spectral coupling scheme based on this idea is under development. The aim is to better capture incoming signals without losing the sponge effect of the Davies scheme, which damps out spurious wave reflections or re-entering waves. This also involves changes in time-interpolation, handling phase-angles and amplitudes rather than the standard spectral coefficients, to get a better description of pattern tracks. The first 2d experiments were quite promising, but the first "real size" test failed. A lot of work is still required for development, case studies and tuning.
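The change in time-interpolation mentioned above can be illustrated on a single complex spectral coefficient. This is a minimal sketch with invented names; the actual implementation works on full spectral arrays and treats the phase evolution more carefully.

```python
import numpy as np

def interpolate_spectral_coefficient(c0, c1, w):
    """Interpolate one complex spectral coefficient between two coupling
    instants using amplitude and phase-angle rather than real/imaginary
    parts.  c0, c1: coefficient at the two coupling times; w in [0, 1]."""
    amp = (1.0 - w) * np.abs(c0) + w * np.abs(c1)
    # take the phase difference along the shorter angular path, so the
    # interpolated pattern moves instead of jumping
    dphi = np.angle(c1 / c0) if c0 != 0 else 0.0
    phase = np.angle(c0) + w * dphi
    return amp * np.exp(1j * phase)
```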

Time-interpolation

Three time-interpolation schemes for lateral boundary conditions are now available : linear, quadratic, and linear-accelerated. The latter needs some more validation, through 3d case studies. Afterwards choices for coupling methods and frequencies in operational applications might be re-examined. Comparisons should also be performed including the developments related to spectral coupling.

New prognostic variables

The introduction of new prognostic variables in ALADIN, induced by developments in physics and dynamics, requires adding new large-scale forcing terms to the coupling fields. The corresponding developments should be rather straightforward.

Towards higher resolution

When going to mesoscale (i.e. horizontal resolutions of 2.5-5 km) the Davies' relaxation scheme might prove insufficient. A combination with the so-called pseudo-radiative scheme is planned on these scales.

In case of multiple nesting, interactive coupling (either one- or two-way) may be interesting. But this is a really difficult research topic in the spectral framework.

In the current solution, NH fields are coupled to the large-scale values. This may not be necessary, since these fields adapt to the local flow very rapidly. A study should be undertaken to decide whether this really matters.

To end with, more case studies with embedded models at increasing resolution are required to track problems and define a set of "optimal" strategies.

Choices for data assimilation

Spectral coupling is also expected to be of great help for the future ALADIN 4d-var assimilation. The 4d-var trajectory is to be fitted to observations only at those scales not yet analysed by the coupling model. The trajectory is short enough to avoid the wave-reflection problem, and spectral coupling ensures that the projection of the "variational trajectories" onto the large scales evolves fully according to the coupling model.

The present choices for coupling within initialization, blending or 3d-var should be compared and differences understood. However this involves only simple and short studies.

DYNAMICS

Introduction

The situation of research and development in numerics / dynamics for ALADIN has greatly improved in the last couple of years. There has been a positive feedback between the development of a simple experimental framework (2d and 1d models for academic situations), a more theoretical understanding of the behaviour of numerical algorithms, more distance exchanges, the emergence of new ideas, and the capability of fast testing of these new ideas. This fruitful strategy must be preserved for the next years.

Main objectives

Each scientific topic is listed below with its priority and required effort in parentheses, followed by the corresponding subtopics.

Hydrostatic dynamics : improved semi-Lagrangian schemes (MEDIUM, 6 person × months)
  · Uniformly accelerated scheme
  · Predictor-corrector (P/C) scheme

NH dynamics : three-time-level semi-Lagrangian (3TL) schemes (HIGH, 12 person × months)
  · Optimal choice of model variables
  · P/C scheme

NH dynamics : two-time-level semi-Lagrangian (2TL) scheme (HIGH, 12 person × months)
  · Properties of P/C scheme
  · Refinement in the choice of model variables
  · Use of decentering

Bottom boundary condition (NH) & related discretisation problems (HIGH, 12 person × months)
  · Optimal discretisation
  · General improvement of the current scheme

Diabatic forcing (HIGH, 6 person × months)
  · Strategy for the diabatic forcing
  · Adaptation to the final choice of prognostic variables (NH)

Orographic forcing (HIGH / MEDIUM, 12 person × months)
  · Optimal filtering of the orography
  · Resonance problem in NH

Relaxation of the thin layer hypothesis (MEDIUM, 12 person × months)
  · Implementation and test in ALADIN
  · Extension to NH

Radiative upper boundary condition (MEDIUM, 12 person × months)
  · Feasibility study : analysis, academic 2d tests
  · Adaptation to ALADIN NH
  · Control of the hydrostatic version

Horizontal diffusion (HIGH, 6 person × months)
  · Horizontal diffusion using semi-Lagrangian interpolators
  · Gridpoint treatment of humidity

(NH : non-hydrostatic model)

Hydrostatic dynamics : improved semi-Lagrangian schemes

An alternative, more accurate scheme was designed and its advantage over the current one demonstrated. It remains to be evaluated in comparison with other proposed improvements, such as the "Predictor/Corrector" scheme. This new time-stepping algorithm has been implemented in ARPEGE by ECMWF. It is potentially more stable than the classical semi-implicit scheme, especially when two-time-level discretisations are used. This approach seems attractive for solving the instabilities sometimes observed with the current operational 2TL time-extrapolating scheme, and is being introduced in ALADIN. Research on this latter topic will cover hydrostatic and non-hydrostatic dynamics simultaneously.
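To illustrate the principle of a predictor/corrector (iterative centred-implicit) time step on a toy problem, here is a minimal sketch; it is not the ARPEGE/ALADIN semi-implicit formulation, and all names are invented.

```python
def predictor_corrector_step(f, x_n, dt, n_corrector=1):
    """One two-time-level predictor/corrector step for dx/dt = f(x)."""
    # predictor: explicit first guess of the state at t + dt
    x_guess = x_n + dt * f(x_n)
    # corrector(s): centred (trapezoidal) update, re-evaluating the
    # right-hand side at the latest estimate of the future state
    for _ in range(n_corrector):
        x_guess = x_n + 0.5 * dt * (f(x_n) + f(x_guess))
    return x_guess

# usage: one step of dx/dt = -x starting from x = 1 (exact: exp(-0.1))
print(predictor_corrector_step(lambda x: -x, 1.0, 0.1))
```

Adding corrector iterations improves both accuracy and robustness at the price of extra evaluations of the right-hand side, which is the trade-off discussed for the 2TL scheme below.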

Non-hydrostatic dynamics : three-time-level semi-Lagrangian schemes

Thanks to the recent discovery of the importance of the choice of variables for the stability of the temporal scheme, a major effort is needed to determine the optimal choice of these prognostic variables. This has to be done considering both stability and precision criteria. The investigation will first be made in a simplified 2d (vertical plane) environment prior to 3d tests (all made first with the adiabatic model). Both the semi-implicit and predictor/corrector schemes shall be tested. The final choice of prognostic variables is crucial for the following developments : TL/AD code, diabatic forcing, relaxation of the thin layer hypothesis, etc.

Non-hydrostatic dynamics : two-time-level semi-Lagrangian scheme

The two-time-level scheme has different stability properties than the three-time-level scheme. Therefore the optimal choice of prognostic variables should be further refined for this scheme. From the currently known facts it is clear that the classical semi-implicit scheme cannot provide a robust solution for the two-time-level scheme : a predictor/corrector scheme is needed. However a careful design (use of decentering, for example) is required to reduce computing costs and remain competitive with the three-time-level scheme.

Bottom boundary condition in the non-hydrostatic model

The formulation of the bottom boundary condition in the NH model is known to be crucial. It mainly affects the precision of the scheme, and even some crude choices may lead to instabilities. Though the current "zero-level free-slip" bottom boundary formulation seems to be a reasonable choice, there are remaining artifacts in its current use. In addition, a more general version (yet not the exact one) of the free-slip condition ought to be tested and possibly refined. There are other open questions on the optimal discretisation which should be explored.

Diabatic forcing

A general problem concerns the impact of the diabatic forcing on the dynamical prognostic variables of the model. Diabatic heating should normally act on both temperature and the pressure departure. For the moment the "hydrostatic" approach is retained also in NH, i.e. the diabatic heating contributes only to temperature, as in the HPE (Hydrostatic Primitive Equations) model (not to pressure, since this is a coordinate in the HPE model). It is not certain whether this approach is really satisfactory. In addition, new prognostic (or pseudo-prognostic) variables may involve temperature and also humidity; in these cases a diabatic feedback on such variables should also be considered. It is also not very clear whether some friction should act on the vertical velocity, and this should be explored.

Orographic forcing

When going to finer and finer horizontal resolutions, the orographic forcing becomes more and more important. It is a well known fact that 2Δx orographic waves are at the origin of spurious stationary forcing. Therefore modellers try to get rid of these shortest waves by applying various filtering techniques to the orography field. In spectral models this filtering has to be done spectrally, due to the spectral representation of orography and the computation of its derivatives, taking into account the aliasing and Gibbs-waves problems. For the time being the retained solution is a simple truncation at the 3Δx wavelength. This remains the choice when using the so-called linear grid (maximal spectral resolution corresponding to a wavelength of 2Δx for all variables but orography). We know today that this choice is very probably not the best one. Another problem to look at is the resonance effect when using the semi-Lagrangian scheme in the NH model. This study should start from basic understanding, using again a simplified academic environment.
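A one-dimensional sketch of the filtering idea discussed above, with invented names; the actual ALADIN orography is treated in bi-Fourier space within the configuration that builds the climatological files.

```python
import numpy as np

def truncate_orography(orography, dx, cutoff_wavelength):
    """Remove from a 1-D orography profile all waves shorter than
    cutoff_wavelength (e.g. 3*dx, as in the text)."""
    n = orography.size
    spectrum = np.fft.rfft(orography)
    wavenumber = np.fft.rfftfreq(n, d=dx)        # cycles per unit length
    spectrum[wavenumber > 1.0 / cutoff_wavelength] = 0.0
    return np.fft.irfft(spectrum, n)
```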

Relaxation of the thin layer hypothesis

In a few words, the relaxation of the thin layer hypothesis means taking into account the curvature of the Earth (the length of horizontal elements depends on altitude) and the vertical components of the Coriolis force. It is another step away from a widely used meteorological approximation, on the way towards the Navier-Stokes equations. The relaxation of the thin layer hypothesis has already been coded in the ARPEGE hydrostatic model, following the adaptation to the pressure vertical coordinate proposed by White and Bromley. Preliminary tests might be done using the ALADIN hydrostatic model (preferably in the tropics, in the presence of orography). The approach should be extended to the hydrostatic-pressure coordinate in NH.

Radiative upper boundary condition

Dating back to 1997, a first version of the Radiative Upper Boundary Condition (RUBC) was developed in the ALADIN hydrostatic model following Herzog (based on the analysis of Klemp, Durran and Bougeault). This RUBC was never successfully extended to the NH version, and many open questions also remained about its behaviour in the hydrostatic model. However, a good RUBC should be very useful for fine-scale models, since it may radiate out of the model domain most of the energy of waves impinging on the model top and avoid their reflection back. Following the newly published analysis of Purser, which usefully extends the approach of Klemp, Durran and Bougeault to the NH equations, the subject is being reopened in ALADIN. This time the study will start with the characteristic-equation analysis for the semi-implicit temporal scheme and, if the answer is optimistic, development will follow, with the first tests done again in a simple academic environment to facilitate the basic understanding of potential problems.

Horizontal diffusion

Horizontal diffusion is a very important topic, but it seems to be quite neglected within the ALADIN community (judging from the number of errors made when setting the horizontal diffusion coefficients). Though the ALADIN geometry is quite simple (there is no large horizontal change of the metric as in the stretched ARPEGE), the correct formulation of the horizontal diffusion will become more and more crucial when going to finer resolutions. From this point of view it would be highly beneficial to develop and maintain a diagnostic tool for examining the performance of the horizontal diffusion scheme. There may be a growing interest in replacing the existing spectral horizontal diffusion scheme by a gridpoint one, based on the diffusive properties of semi-Lagrangian interpolators and acting on gridpoint variables as well. This might be especially useful for humidity fields. There are other subtopics associated with the horizontal diffusion scheme : optimal diffusion of the NH variables (pending their choice), horizontal diffusion acting along η-surfaces in the presence of sharp orography, etc.
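For illustration, a linear spectral horizontal diffusion acts on each coefficient roughly as in the following sketch (invented names and coefficients, an implicit damping proportional to a power of the total wavenumber; the operational formulation differs).

```python
import numpy as np

def spectral_diffusion_step(spectrum, ktot, kmax, dt, tau, order=2):
    """Damp each spectral coefficient according to its total wavenumber
    ktot, with e-folding time tau at the truncation limit kmax and a
    diffusion operator of the given order (implicit in time)."""
    damping = 1.0 / (1.0 + (dt / tau) * (ktot / kmax) ** (2 * order))
    return spectrum * damping
```

Errors in setting the equivalent of dt, tau or the order are exactly the kind of mis-tuning the proposed diagnostic tool would help to detect.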

PHYSICS

Introduction

As for the other research topics, the strategy for developments in physics should take into account the foreseen changes in NWP until the end of this decade, i.e. reaching mesh sizes of 2-3 km, when intense convection and some orographic waves will be explicitly represented. It should involve some divergence (but not too much !) between the physical parameterizations used in the ARPEGE and ALADIN models, and might benefit from the experience of Meso-NH (an anelastic model used for pure research at very high resolution at Météo-France). A larger dispersion of the interests and means of the different partners is to be considered as well.

For the transition period towards very high resolutions, the ALADIN physics should be a correct compromise at scales around 5 km, still compatible with the ARPEGE physics and prepared to include a complex representation of microphysics and 3d turbulent processes as the next step. But before that, the problems already known at the current resolution (around 10 km) should be cured. In connection with the march towards 4d-var assimilation at small scales in ALADIN, an improvement of the present simplified physics is also required.

Main objectives

Actions are grouped below by axis of research and main topic; the priority of each action is given in parentheses.

Use of new prognostic variables

  Convection :
    · Introduction of a prognostic convection scheme (HIGH)
    · Management of the 4 new variables (MEDIUM)
    · Validation over an extended set of situations (MEDIUM)
    · Investigating problems in the triggering of convection (HIGH)
    · Analysis of the closure and hysteresis problem (MEDIUM)

  Microphysics :
    · Management of 2 or 3 new variables : condensed water (HIGH)
    · Further analysis of the "Functional Boxes" approach (HIGH)
    · Introduction / choice of a semi-complex microphysics (HIGH)
    · Interface with convection (MEDIUM)
    · Prognostic treatment of falling condensates or not ? (MEDIUM)

  Vertical diffusion, low cloudiness, PBL, ... :
    · Introduction of a prognostic TKE scheme (1 new variable) (MEDIUM)
    · Interaction with other developments concerning PBL :
      * link between top of PBL fluxes and cyclogenetic activity (HIGH)
      * noise in shallow convection (HIGH)
      * PBL-height dependent mixing lengths (MEDIUM)
      * developments in the anti-fibrillation scheme (LOW)
      * improvement of low-level cloudiness (diagnostic scheme) (HIGH)

  General problems :
    · Update of thermodynamics (LOW)
    · Consistency with the other parameterizations (HIGH)
    · Interface with coupling, dynamics and data assimilation (HIGH)
    · Consistency with regular physics (MEDIUM)
    · Validation at various horizontal and vertical resolutions (MEDIUM)

Improvement of basic parameterizations

  Radiation :
    · Refinements of optical depths (HIGH)
    · Move (choice, development) to an intermediate scheme (MEDIUM)

  Orography :
    · Improved smoothing of very small scales (MEDIUM)
    · Management of the extension (and coupling ?) zone (LOW)
    · Tuning of the envelope (HIGH)
    · Better description of roughness length (MEDIUM)
    · Investigation of feed-backs with other parameterizations (HIGH)
    · Study of local circulations (MEDIUM)
    · Development of new diagnostics (MEDIUM)

  Surface :
    · Parameterization of lakes (MEDIUM)
    · Improved description of evaporation over sea (MEDIUM)
    · Revisit of the z0h/z0m ratio over land (LOW)
    · Improved databases for soil and vegetation (MEDIUM)

Simplified regular physical parameterizations :
    · Tuning of diffusion (HIGH)
    · Improved description of humidity (MEDIUM)
    · Validation at high resolution (LOW)
    · Consistency with the "full" physics (MEDIUM)

Physics / dynamics interface :
    · Introduction of the new variables (HIGH)
    · Interaction with the predictor / corrector approach (MEDIUM)
    · Interface with "externalized" parts of the physics (LOW)

Validation

  Case studies :
    · Identification and study of "strange behaviour" cases (HIGH)
    · Selection and documentation of extreme situations (MEDIUM)
    · Validation on a wider range of situations (MEDIUM)

  New observations :
    · Comparison to satellite data (MEDIUM)
    · Comparison to radar or lidar data (LOW)
    · Interfaces to new field experiments (MEDIUM)

  New methods :
    · Design of new scores or criteria (HIGH)
    · Use of expert systems to identify fine scale structures (LOW)

Use of new prognostic variables (and changes in the concerned parameterizations)

Convection

The prognostic convection scheme developed and tested by Luc Gérard in ALADIN includes four new prognostic variables : the updraft and downdraft mass fluxes and the corresponding active mesh fractions. The scheme should be complemented by a prognostic parameterization of the suspended condensate. The first results are encouraging (a slower development of the convective activity than with the diagnostic scheme, which has shown a too fast evolution), but the further validation of the scheme raises code constraint problems. Also, the interaction with the prognostic condensed water constitutes a complicated issue.

Microphysics

In the ARPEGE/ALADIN community there are two different approaches for the prognostic condensed water. The so called "Functional Boxes" approach was designed in cooperation with HIRLAM to :

¨ handle the vapour, liquid and ice phases of atmospheric water

¨ allow to test independently various parameterizations

¨ be portable to other NWP models

A first version is now available and was tested in the 1d version of ARPEGE/ALADIN, using GATE observations. The simulations have shown a problem with the ice/water phase transitions.

The scheme designed by Philippe Lopez, of moderate complexity, is able to treat both the atmospheric cloud condensate and the precipitation content separately, in a prognostic way. Although it was originally designed for the future variational assimilation of cloud and precipitation observations, it could certainly be used in short-range numerical weather prediction.

The choice between the "Functional Boxes" approach (with a unified forcing for the microphysics part, but involving problems for the melting/freezing processes), the scheme of Ph. Lopez (treating only the large-scale condensation) and a complete rewriting of the precipitation-processes parameterization (with significant tuning efforts) seems to be very difficult. Additional issues, like the need for prognostic variables for precipitating condensate and cloudiness, or the way the condensate variables are used inside the deep convection scheme, are to be considered.

"Turbulent processes"

The implementation of a prognostic TKE (Turbulent Kinetic Energy) scheme appears less problematic. Unless major problems arise, the CBR (Cuxart Bougeault Redelsperger) scheme will be ported to ALADIN, the only initial problem being that of the choice of the exact version of this most-used scheme. Problems of compatibility with the parameterisation exchanges and of a technical equivalent to the anti-fibrillation scheme(s) will surely have to be treated specifically for the NWP applications.

Considering all the above, the ALADIN community will face a huge challenge to (i) find the right level of complexity for the 5 km target, (ii) harmonise the progress along three or four delicate paths and (iii) tune the new choices to beat the rather well optimized "classical" set (that will still also progress a bit). It is not clear whether our current potential on physical parameterisation will be up to that task.

Improvement of the low cloud representation

The diagnosis and identification of problems are going on within a Météo-France internal PhD thesis. An improved solution for the current cloud/condensate diagnostic scheme can be expected but, as for PBL and radiation, it will probably mark the end of the current phase, before the study of a new scheme starts. The latter will involve interactions with all the above-mentioned prognostic questions and will therefore call for a lot of coordination.

PBL

If one sets aside the new "prognostic" aspects, there remain (at least) two on-going studies of interest (a kind of contribution to a CYCORA-quattro) : the study of the relationship between turbulent fluxes in deeply stable PBL situations and cyclogenetic activity, which could lead to a way to make the mixing-length approach compatible with the local depth of the PBL ; and the removal of remaining fibrillations by a time-smoothing term introduced in the shallow convection part of the turbulent diffusion parameterization. This is likely to be the last step before the big jump to a more complex physics of vertical turbulent exchanges (friction as well as dry, moist, shallow and deep convection together). Given the time and effort it will take to tune the new schemes until they are superior to the old ones, the current trend of no longer following the updates of the physics in the very places where research on new parameterisation schemes could take place is extremely worrying. Even more than for the cloud scheme, a transversal experimental framework is impossible to imagine in a structure where even the basic progresses are considered as too demanding to be implemented operationally.

Improvement of basic parameterizations

Radiation

The problem of insufficient surface downward long-wave radiation made a re-tuning of the transmission function necessary. It seems that the solution of modified algorithms for the computation of the optical depth (work of Roger Randriamampianina) does not work at higher vertical resolutions (passing from 31 to 41 vertical levels). Beyond an even simpler tuning improvement, another scheme of intermediate complexity will have to be used later, the exact choice and its constraints probably being driven by data assimilation considerations.

Orographic forcing

The operational use of the ARPEGE model has revealed a remaining lack of drag over the Rocky Mountains; additional problems have appeared since the introduction of the CYCORA package, and it is probable that not all of them have been cured by the undoubtedly positive updates of CYCORA-bis and (perhaps) CYCORA-ter. A tool to point out momentum imbalances and to evaluate the different tunings should be developed. In parallel, it appears necessary to create a (distributed) team to work on the envelope problems, on roughness, blocking and lift/drag gravity-wave effects.

On a more pragmatic path, the vexing problem of too much precipitation on mountain upslopes and in land-fall areas will have to be attacked specifically.

Surface parameterization

Snow cover : improvement/updating of the snow cover analysis and improvement of the description of the snow coverage and associated albedo are well in preparation. There will probably be a problem of harmonisation of procedures at the ARPEGE/ALADIN interface, as when ISBA was implemented. This is linked to two other problems : there is a requirement for the externalization of the ISBA scheme in the framework of the AROME project, and the data assimilation aspects in ALADIN must take into account the link between blending and the surface analysis (this problem could disappear if all "blended" applications moved rapidly to a full surface-analysis implementation at the ALADIN level !).

Lake representation : tools are already available to improve lake temperatures (for the 923 and 927 configurations), and a parameterization of the temperature evolution is considered, either using a simplified model embedded in ISBA or through coupling with the Hostetler lake model. The proposal is thus to continue the developments along similar lines.

Evaporation over sea under weak mean wind should take into account the impact of convectively generated turbulence, together with some retuning of the roughness-length formulation (which currently treats only the dry part of the problem).

Simplified physics

A whole set of simplified physical parameterizations (including tangent linear and adjoint versions) is necessary in 4d-var analysis, to take into account the influence of physical processes over the assimilation period and to assimilate diabatic observations such as radar reflectivities, cloudy radiances, etc. For these reasons the simplified physics will play an increasing role in a 4d-var mesoscale analysis. But the requirement that the simplified physics be as regular as possible, to prevent instabilities in the linear models, while remaining as close as possible to the full physical parameterizations, will demand more and more attention when going to higher resolution.

So we should continue to increase the coherence between full and simplified physics and to investigate the quality of our simplified physics at higher resolution, with specific attention to vertical diffusion. Important work should also be done on the simplified "large scale precipitation" and "convection" schemes to start the work on diabatic initialization.

Interface physics-dynamics

In the ALATNET framework, a study has already been started on the definition of "optimality" for the physics-dynamics interface, especially in view of the future move to a predictor-corrector time-stepping procedure. A more theoretical side of this topic might also be studied elsewhere. It is also foreseen that a person from Météo-France will be involved in this topic, initially for the implementation of the Meso-NH physics in ALADIN in the framework of the AROME project (hence with emphasis on new prognostic variables). The problem here will not be to find the workforce but to avoid the emergence of contradictory paths.

On another, more pragmatic side, it is very likely that all the planned work will create situations of complex unphysical numerical behaviour (like the recent strange chain of three or four occurrences within two months). Here it is not only the workforce that is likely to be missing, but also the testing/proposing capacity to find practical and viable solutions to such problems.

Validation

The fact that standard objective criteria are not always able to demonstrate the differences between ARPEGE and ALADIN forecasts underlines the difficulties of the validation of a model at high resolution. New strategies are to be defined, and the corresponding tools developed. This topic may be considered as the counterpart of the problem of mesoscale verification.

Several directions may be investigated :

· use of imagery data, either via a "model to satellite" approach (a prototype version of such tools is already available), or by comparison with radar or lidar data (as scheduled in the CLOUDNET program);

· comparison to more field experiments, including the development of interfaces to the 1d version of the model;

· use of statistical scores rather than standard ones, to take into account extreme events;

· identification of phenomenon-oriented criteria,  use of expert systems to identify the fine scale structures;

· more careful selection and documentation of extreme cases or strange behaviours of the model;

· and validation over wider sets of situations.

DATA ASSIMILATION : METHODS

Introduction

This part deals with the various facets of variational data assimilation. The optimal-interpolation (O.I.) code, CANARI, will not disappear before the end of this new research plan. It will be maintained, used operationally, and improved for the purposes of surface analysis (assimilation of soil/surface variables : snow cover, temperature, moisture ; Diag-Pack ; quality control). Presently a prototype version of 3d-var is available in ALADIN, but some important choices are still debated and experts are really too few. By 2004 we expect to obtain a continuous variational data assimilation system, including a prototype version of 4d-var (for research, not optimized). The maintenance of scripts is essential here.

There is no estimation of the required effort to perform the described work in upperair data assimilation (methods and observations), because of the numerous possible choices. However one may estimate the minimum cost of an operational implementation of 3d-var using satellite data at 19 person × months, if starting from "nothing" (including 12 person × months of local work). A further move to 3d-FGAT is likely to require 8 person × months of common effort.

Algorithmic aspects

The importance of each task is given in parentheses; where a single level is given after the topic, it applies to all its tasks.

3d-var :
  · Use of observations at the borders of the domain (HIGH)
  · New minimization algorithms (LOW)
  · Design of an explicit spectral blending and combination with 3d-var (MEDIUM)
  · Improvement of observation operators (vertical interpolations) (HIGH)
  · Choice of the time-window for the selection of observations (HIGH)

3d-FGAT (MEDIUM) :
  · Implementation of a 4d screening
  · Choice of lateral boundary conditions
  · Choice of the time-window

4d-var :
  · Maintenance of the TL/AD code (for various research purposes) (HIGH)
  · Coding TL/AD of semi-Lagrangian schemes (LOW)
  · Definition of coupling strategies for the various elements (LOW)
  · Adaptation of Jc-dfi (to high resolution, to new variables) (LOW)
  · Improvement of simplified physics (LOW)

Simplified physics (LOW) :
  · Evaluation through sensitivity studies
  · Evaluation and tuning at high resolution
  · Solving incrementality problems
  · Adapting observation operators to new variables

A-posteriori validation :
  · Further tuning of statistics (observations + background) for 3d-var (HIGH)
  · Extension of diagnostic tools to 4d-var (LOW)

TL/AD tools :
  · Maintenance of the TL/AD code (reminder) (HIGH)
  · Use in the design of the TL/AD code (e.g. LBC, NH, new variables) (MEDIUM)
  · Use to study nonlinearity problems (e.g. in simplified physics) (MEDIUM)
  · Predictability studies (LOW)

Var-Pack :
  · Watch (LOW)

A prototype version of 3d-var is already available, but the problem of lateral boundaries is not fully solved and some main issues are still debated : choice of background error statistics, improvement of observation error statistics, cycling (which combination with which type of blending, time-window, length of the cycle, role of initialization, ...). Moreover the characteristics of the domain (extension, resolution, nesting) and the density of observations are likely to influence such choices. Some optimization may also be required to reduce computational costs. As an example, an "explicit" blending, based on a raw combination of spectral coefficients, is expected to be less expensive than "dfi"-blending, based on digital filter initialization. New minimization algorithms might be required in case more sophisticated observation operators are used, i.e. with a non-quadratic Jo (the background term, Jb, will remain quadratic thanks to the choice of a potentially incremental formulation).
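For reference, in the incremental formulation the cost function minimized by 3d-var can be written in the standard notation (not specific to ALADIN):

```latex
J(\delta x) = \tfrac{1}{2}\,\delta x^{\mathrm T}\,\mathbf{B}^{-1}\,\delta x
  + \tfrac{1}{2}\,\bigl(\mathbf{H}\,\delta x - d\bigr)^{\mathrm T}\,
    \mathbf{R}^{-1}\,\bigl(\mathbf{H}\,\delta x - d\bigr),
\qquad d = y - H(x_b)
```

where δx is the increment with respect to the background x_b, B and R are the background and observation error covariance matrices, H the observation operator (H its linearization), y the observations and d the innovation vector. Jb stays quadratic in δx, whereas Jo may become non-quadratic when the full nonlinear operator H(x_b + δx) is retained, which is the case motivating the new minimization algorithms mentioned above.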

3d-FGAT (FGAT standing for "First Guess at Appropriate Time") is the next step before moving to 4d-var itself. The issues to be addressed are the 4d screening (comparing asynoptic observations with the background valid at the right time, which makes it possible to compute the distance to observations more accurately), the choice of the lateral boundary conditions for the corresponding trajectory, and the length of the time-window.
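
What changes in 3d-FGAT, in the same generic notation : the innovations are computed against the background valid at each observation time t_i, while the increment itself remains three-dimensional (the tangent-linear model is replaced by the identity) :

    d_i = y_i - \mathcal{H}_i\bigl(x_b(t_i)\bigr),
    \qquad
    J_o(\delta x) = \tfrac{1}{2} \sum_i (\mathbf{H}_i\,\delta x - d_i)^{T} \mathbf{R}_i^{-1} (\mathbf{H}_i\,\delta x - d_i)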

4d-var won't be considered before 2004, and its optimization is not a priority. Its progress relies on research on the following items : semi-Lagrangian advection in the tangent-linear (TL) and adjoint (AD) models (to reduce the cost of 4d-var), the choice of a coupling strategy (for the numerous integrations of the full, TL and AD models), the refinement of the weak-constraint term (Jc, based on digital filters), the use of a stable simplified physics suitable for small scales (including its evaluation and tuning at mesoscale, the study of problems related to incrementality, and possibly the use of new observation operators for water-related variables), and the evaluation of costs.

A related topic is the use of the tangent-linear and adjoint codes to calculate gradients and singular vectors. These tools can be used to study some aspects linked with 4d-var, for instance the use of a semi-Lagrangian advection scheme, the degree of linearity of the evolution of perturbations, and the use of a non-hydrostatic version of the model.

A-posteriori validation is to be continued, in order to tune forecast and observation error variances. This includes the evaluation of representativeness errors, which may be smaller in a high-resolution limited-area model. An extension to 4d-var may also be considered, for instance in order to diagnose model errors.

Replacing the optimal-interpolation code by a variational counterpart for the surface is a very heavy and tricky task, hence not considered as a priority : no developments are planned before 2004. Some scientific watch, on the link with the surface analysis and on Jb issues, is recommended. The progressive use of a coupled system "upperair 3d-var + surface O.I." will probably lead quite naturally to the identification of "phasing" problems between altitude and surface fields. The exact extent and nature of these problems will then determine the type and amount of work to be devoted to this project ; an a-priori evaluation is not reasonable here. Note that combined variational + optimal-interpolation assimilation systems have been used operationally in many global models for years (since 1997 in ARPEGE).

Modelling of background / forecast errors

Main topics and main tasks, with their importance :

Sampling methodology :
  • Evaluation of the different contributions to error covariances (HIGH)
  • Ensemble analyses and forecasts with perturbed observations (MEDIUM)
  • Singular vector approach (LOW)

Diagnostics :
  • Heterogeneity and anisotropy (HIGH)
  • Time dependence (MEDIUM)
  • Nonlinear effects (MEDIUM)

Jb formulation :
  • Approaches based e.g. on diagonal blocks and wavelets (MEDIUM)

New variables :
  • Taking into account new prognostic variables (NH, cloud water, ...) (HIGH)

A first identified issue is the choice of the sampling methodology used to compute the forecast errors from which covariances are calculated. It has proved useful to compare different versions of the so-called NMC method, investigating the influence of factors such as lateral boundary conditions, initial conditions, forecast ranges and mesoscale processes. The so-called lagged-NMC method was shown to be relevant for a mesoscale limited-area model. An alternative method is to perturb the observations used to define the initial state, so as to produce an ensemble of analyses and forecasts from which statistics can be calculated. This makes it possible to study and represent, for instance, the evolution of analysis errors into forecast errors and the subsequent cycling. It should be tried first in ARPEGE, and considered for ALADIN later on. A comparison with other approaches, such as those based on singular vectors, may also be considered at a later stage.
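
A minimal sketch of the standard NMC-type sampling, assuming hypothetical arrays of forecast pairs valid at the same time but with different ranges (e.g. 36 h and 12 h) ; the actual ARPEGE/ALADIN computation is done on spectral fields, level by level, with an additional rescaling :

    import numpy as np

    def nmc_covariances(fc_long, fc_short):
        """Estimate background error covariances from forecast differences.

        fc_long, fc_short : arrays of shape (n_cases, n_state) holding forecasts
        valid at the same time but with different ranges (e.g. 36 h vs 12 h).
        Returns the sample covariance matrix of their differences.
        """
        diff = fc_long - fc_short                   # one forecast difference per case
        diff = diff - diff.mean(axis=0)             # remove the mean before sampling
        return diff.T @ diff / (diff.shape[0] - 1)  # sample covariance matrix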

Forecast error covariances have been studied through their average three-dimensional auto- and cross-correlations, and through their vertical and scale dependencies. This is useful, for instance, to compare different sampling methodologies and to investigate the implied behaviour of the Jb term in the 3d-var analysis. New features were shown to be important and must be diagnosed carefully : heterogeneity and anisotropy, temporal evolution (flow-dependence, cycling of errors, seasonal dependence), nonlinear effects, ...

Generalizations of the current Jb formulation will be required in order to represent newly identified features such as heterogeneity and anisotropy. Representing important correlations between different wavenumbers, either explicitly (as recently tried) or implicitly (e.g. through the use of a wavelet approach), may be considered in this perspective.
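
As a reminder of why this matters : in the usual homogeneous spectral formulation (generic notation), the background term reads

    J_b(\delta x) = \tfrac{1}{2}\,\delta x^{T} \mathbf{B}^{-1} \delta x,
    \qquad
    \mathbf{B} = \mathrm{diag}\,(\mathbf{B}_n)_{n=0,\dots,N},

with one vertical covariance block B_n per total wavenumber n and no correlation between different wavenumbers ; it is this block-diagonal structure that enforces horizontal homogeneity, and allowing off-diagonal blocks (explicitly, or implicitly through a wavelet basis) is what introduces heterogeneity and anisotropy.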

Finally, the formulation of the background term must be adapted to the most important changes in the model, in particular to the introduction of new prognostic variables. This work will actually be driven by the schedule of the (pre-)operational implementation of new variables in ARPEGE or ALADIN. Furthermore, the issue may sometimes be whether or not to include a new field in the analysis at all.

Cycling

Main topics and main tasks, with their importance :

Blending :
  • Maintenance of a reference version of dfi-blending (HIGH)
  • Adaptation to the main changes in the model (HIGH)
  • Development of double-nested blending (MEDIUM)
  • Comparison of "dfi" and "explicit" blending for spectral fields (MEDIUM)

Assimilation cycle :
  • Investigating / comparing the various combinations between 3d-var, dfi and blending (HIGH)
  • Combination with surface analysis or surface blending (MEDIUM)
  • Moving to 3d-FGAT (MEDIUM)
  • Maintenance of a reference version (HIGH)

Frequency of 3d-var :
  • Evaluation through sensitivity studies (MEDIUM)

Regarding the present version of blending (i.e. the one operational for ALADIN/LACE), a reference version, optimized and portable, is required, including scripts, namelists and pieces of code. It must be updated regularly, taking into account the main changes in the model (new variables, new namelist parameters, linear grid, ...), and the associated retunings must be documented. Double-nested blending may be achieved through two different techniques : either a double application of digital-filter blending, or the downstream use of an explicit blending (with a direct combination of the large-scale part of the coupling model's spectrum and the small-scale part of the coupled model's spectrum).
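
A minimal sketch of what such an "explicit" blending amounts to, assuming hypothetical arrays of spectral coefficients already expressed in the same (ALADIN) spectral geometry ; the cutoff wavenumber is illustrative and, in practice, a smooth transition zone rather than a sharp cutoff may be preferred :

    import numpy as np

    def explicit_spectral_blend(coupling_spec, coupled_spec, total_wavenumber, n_cut):
        """Keep the large scales of the coupling (driving) model and the small
        scales of the coupled (nested) model, coefficient by coefficient.

        coupling_spec, coupled_spec : spectral coefficients on a common truncation
        total_wavenumber            : total wavenumber of each coefficient
        n_cut                       : illustrative cutoff wavenumber
        """
        return np.where(total_wavenumber <= n_cut, coupling_spec, coupled_spec)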

A prototype version of a full ALADIN assimilation cycle, called Blend-Var and combining spectral dfi-blending, 3d-var, digital filter initialization (and, very recently, surface analysis), has been designed. A portable, even if not optimized, version should be written and made public, to provide a basis for the march towards 3d-FGAT. Though Blend-Var was shown to give better results than some other combinations in the assimilation experiments performed so far, this issue is still debated. The use of high-resolution observations, the introduction of surface analysis, an increased resolution, the use of lagged coupling files, ... or computing resources may impose other solutions. Possible candidates, speculatively, are :

  • explicit blending and 3d-var,

  • 12 hour spin-up cycle (still used at CMC - Canada),

  • combination of digital filter initialisation or finalisation and 3d-var in a sequential approach,

  • "gridpoint upperair implicit" blending, i.e. use of ARPEGE analyses as coupling files along the assimilation cycle as is done operationally for Al-Bachir (Morocco),

  • alternative combinations of dfi-blending and 3d-var (such as Var-Blend instead of Blend-Var).

The last two solutions are obviously the ones that, like dfi-blending, best preserve the benefit of having a 4d-var assimilation in ARPEGE.

The definition of the frequency of 3d-var is a specific issue, closely related to the choice of the time-window for observations, the type of cycling, the characteristics of the domain and operational constraints. Examples of frequencies that may be considered are 1 hour, 3 hours and 6 hours.

As a general feeling, the work on cycling is considered crucial (here the ALADIN scientists perhaps pay more attention to this issue than other LAM projects do). For the sake of simplicity and of a common effort in the maintenance of a genuine assimilation script, a reasonably homogeneous choice of one solution would be favoured. However the interactions with data frequency, domain geometry and lateral boundary conditions might lead to diverging solutions in the national centres. Within one or two years it should become clearer how this far-reaching question will be answered.

DATA ASSIMILATION : OBSERVATIONS

Introduction

This is the major challenge in data assimilation for the next year. ALADIN is quite late in this domain, and it is always useful to recall that a sophisticated mesoscale data assimilation scheme makes no sense without high-resolution observations to feed it.

Main objectives

Main topics and main observations or tasks, with their importance :

ODB :
  • Maintenance and documentation (HIGH)
  • Development of new tools (MEDIUM)

Satellite data :
  • IASI/AIRS : improved description of surface emissivity (MEDIUM)
  • Raw ATOVS data : use of local data (MEDIUM)
  • Cloudy ATOVS data : observation operator, Jb (MEDIUM)
  • GPS (LOW)
  • SSM/I (LOW)
  • Profiler data (LOW)

Surface observations (for upperair analysis) :
  • From the less difficult or most important ones to new ones : surface pressure, 2m-relative humidity, 10m-wind, ... (MEDIUM)

Aircraft data :
  • Use of local data (HIGH)

Radar :
  • Winds (LOW)
  • Reflectivities (MEDIUM)

Pre-analysed data (or pseudo-observations) :
  • Pseudo-TEMP for relative humidity : case studies (HIGH)
  • Pseudo-TEMP for relative humidity : regular use (MEDIUM)
  • Surface data bogus (MEDIUM)

Screening :
  • Evaluation for high-density data (HIGH)
  • New data types (MEDIUM)
  • Time dimension (window, 4d version) (MEDIUM)
  • PBL fields (MEDIUM)

Space consistency :
  • Combination with the use of the CANARI quality control (LOW)
  • Variational quality control (MEDIUM)

Notes

ODB : Observational Data Base (interface between the local management of observations and the model)

AIRS : Atmospheric Infra-Red Sounder

IASI : Infrared Atmospheric Sounding Interferometer (the European infra-red sounder)

ATOVS : Advanced Tiros Operational Vertical Sounder

HIRS : High-resolution Infrared Radiation Sounder

NESDIS : National Environmental Satellite, Data, and Information Service (a NOAA centre in charge of the processing and distribution of satellite data, located in Washington).

CMS : Centre de Météorologie Spatiale (Lannion, France)

GPS : Global Positioning System

SSM/I : Special Sensor Microwave/Imager (providing total precipitable water contents and sea surface wind speeds)

VAD : Velocity-Azimuth Display (a wind-profile retrieval algorithm for Doppler radar data)

ODB

There are too few experts on this topic, though ODB should be used with the present and next libraries. Moreover a more detailed documentation is needed and new tools must be designed : to add / exchange local observations, for monitoring, for verification procedures, ...

Satellite data

IASI / AIRS : Work has just started to allow their use over land. These data are presently used over sea in ARPEGE, but ALADIN domains are mainly continental. The first step is the improvement of the description of emissivity, considering different wavelengths and the main characteristics of the surface (vegetation, moisture, ...). Experiments will then mainly focus on studies with a 1d-var scheme, to investigate whether emissivity parameters can be retrieved adequately from AIRS / IASI data. In addition, monitoring will be performed to check whether this improves the fit to HIRS / AIRS radiances.

Raw ATOVS data : Experiments may start using NESDIS data, but local data, which are available earlier and at a higher resolution, should be used as far as possible. Preliminary experiments in the framework of ARPEGE have shown a positive impact. Such data are available in France and Hungary, at least. Some might soon be available over Northern Europe via EUMETSAT. Besides, the introduction of the skin temperature (at the observation point) in the control variable and some setup for the use of raw radiances must be ported to ALADIN.

Cloudy raw radiances (ATOVS) : Cloudy radiances from the model (with the current cloud scheme) must be computed at observation points, to be compared with ATOVS data (HIRS). This work must be done in close cooperation with "satellite" teams (detection of clouds in observed data), with the people working on background errors (introduction of statistics for cloud parameters) and, of course, with physics experts.

GPS / ground-based signal delays : Assimilating these data, which provide information on the total water content, is not a priority, although trials may be done if good datasets are provided. In any case these observations won't be fully available within two years.

SSM/I : Only raw radiances over sea will be used. This topic should wait at least until the developments linked to the next-generation SSM/IS are completed in the global model.

Profiler data : These data have a sparse spatial coverage but a high time frequency. Therefore they will be of interest only once 3d-FGAT or 4d-var is ready.

Surface observations (for upperair analysis)

The following order of priority is identified :

  • use of denser surface pressure observations,

  • improvement of vertical interpolations in PBL within the observation operators, consistently with physics,

  • use of 2 m relative humidity,

  • use of 10 m wind over land, only on flat areas first.

The use of 2 m temperature is not considered since past experiments demonstrated it is very dangerous.

Aircraft data

Though not yet providing information on humidity, local aircraft data are likely to provide some useful mesoscale information. This was confirmed by recent assimilation experiments aiming at improving the "thinning" distance of AIREP observations : cross-validation scores of the 3d-var analysis indicated an encouraging improvement when the density of assimilated AIREP observations was increased. Continuing efforts on the use of dense aircraft data are therefore strongly supported, especially in the framework of a retuning of the time-window or a move to 3d-FGAT, since the time dimension is important here.

Radar

Wind : Such data can be assimilated either as radial winds or as VAD retrievals. The questions already identified are the following : collecting the data, pre-processing, estimation of error statistics. This is a huge amount of work, while Doppler radar networks are not yet available. This task is thus given a low priority.

Reflectivity : A direct assimilation into 3d-var requires some work on the observation operator, namely the calculation of reflectivities from water condensate and precipitation fields. Such calculations are likely to be easier to handle once such fields are directly available as prognostic variables. At that time a few ALADIN experts (in radars and in data assimilation) should also be available. A possible intermediate step is the use of pre-analysed data (see next section), such as humidity profiles estimated from radar reflectivities. Reflectivities may also be considered for verification (following some recent work in the HIRLAM group for instance) and for Diag-Pack.
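
As a purely illustrative sketch of what such an observation operator involves, here is a toy conversion from model rain content to reflectivity ; the power-law coefficients are placeholders, since real values depend on the assumed drop-size distribution and on the hydrometeor species :

    import numpy as np

    def toy_reflectivity_operator(rho_air, q_rain, a=3.6e9, b=1.75):
        """Toy forward operator : rain mixing ratio -> radar reflectivity (dBZ).

        rho_air : air density [kg m-3] ; q_rain : rain mixing ratio [kg kg-1].
        a, b    : placeholder power-law coefficients (drop-size-distribution dependent).
        """
        rain_content = rho_air * np.maximum(q_rain, 0.0)      # rain water content [kg m-3]
        z_lin = a * rain_content ** b                         # Z in mm^6 m-3 (placeholder law)
        return 10.0 * np.log10(np.maximum(z_lin, 1.0e-3))     # convert to dBZ, avoid log(0)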

Pre-analysed data / pseudo-observations

The underlying idea is to pre-process data, using a background forecast, to convert them into a "standard" type of observation, rather than designing a specific observation operator.

A first example is pseudo-TEMP observations of relative humidity, mainly following work performed in other teams at CNRM. This would make it possible to start assimilation experiments with already existing ("off-the-shelf") data, prepared for field-experiment studies (e.g. MAP cases).

The second one concerns the pre-processing of observations for the surface analysis (pseudo-SYNOP), starting with the assimilation of snow.

A small steering group has been nominated to study this approach further (e.g. sensitivity to the background, error statistics, extension to other data) ; over the mid-term, the production of such data for regular use may also be considered.

Quality Control

The screening is to be continuously developed and tuned. Its present version must be evaluated for high-density networks, both for surface and upperair (satellite or pre-analysis products) observations. The sensitivity to data-rejection thresholds is to be studied, for instance. New data types are to be introduced, and the appropriate length of the time-window is to be determined. Taking into account the time dimension (4d screening) will allow an accurate treatment of asynoptic data. Moreover the management of PBL observations requires significant improvements.

The screening includes a first-guess check (comparison with the background forecast), but no test of spatial consistency (control of each observation against its neighbours). Introducing such a spatial-consistency check in the screening might prove fairly difficult technically, if it turns out to be too orthogonal to the present structure of the code. One possibility would be to combine the screening with the CANARI quality control (as the latter includes a spatial-consistency check). Another possibility would be to use a variational quality control, which is known to account for spatial consistency, in addition to representing non-Gaussian errors. A further advantage of the variational quality control is that it is already available in the ARPEGE code and that its tuning methodology is well defined.
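
For reference, a minimal sketch of the usual variational quality-control formulation, in generic notation (the ARPEGE tuning is not reproduced here) : the Gaussian observation likelihood is mixed with a flat "gross error" distribution of prior probability A, so that the observation term becomes

    J_o^{QC} = -\ln\bigl[(1 - A)\,e^{-J_o^{N}} + A\,\gamma\bigr] + \text{const},

where J_o^N is the usual quadratic observation cost and \gamma the relative weight of the flat distribution ; observations far from the analysis thus receive a smoothly reduced weight, and neighbouring data effectively check one another through the common analysis.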

DATA ASSIMILATION : SURFACE

Introduction

The analyses of boundary-layer fields (T2m, H2m, V10m, ...) and of soil and surface variables are discussed separately from the rest, since they are based on quite different tools : O.I. versus variational analysis, gridpoint versus spectral management, quite different time-scales, a major dependency on surface characteristics and physics, ... A unified variational package for both upperair and soil/surface (and for both spectral and gridpoint) fields remains a long-term issue. A small working group will keep watching similar experiments elsewhere, but no action should be undertaken within the next three years. Anyway surface and upperair analyses are not independent at all, and assimilation scripts must be carefully designed to combine them optimally.

Surface analysis is used in two ways :

  • to initialize soil and surface variables (temperature, moisture, snow characteristics, ...) : assimilation purpose;

  • to provide an analysed state as close as possible to all available observations : nowcasting purpose.

Most of the required improvements will benefit both projects, even if some tunings are likely to differ.

Concerning assimilation, the main objective is an operational small-scale assimilation of soil and surface variables, based on optimal interpolation first, then on a mixed O.I./2d-Var scheme (O.I. for PBL fields and snow, 2d-var for soil moisture and temperature). An optimal combination with the initialization of upperair fields (standard, dfi-blending, explicit blending, 3d-var, FGAT, 4d-var) is to be defined at each step.

Main objectives

Main topics and main tasks, with their priority and the required effort (p.m. = person × months) :

Analysis of PBL fields (for Diag-Pack and for the correction of soil variables) :
  • Retuning of statistics (forecast and observation errors) (HIGH, 3 p.m.)
  • Geographically dependent error statistics (orography, coasts, ...) (MEDIUM, 6 p.m.)
  • Analysis of new fields (precipitations, visibility, cloudiness, ...) (MEDIUM, 12 p.m.)

SST analysis :
  • Retuning (HIGH, 2 p.m.)
  • Use of pre-processed satellite data (MEDIUM, 6 p.m.)

Snow analysis :
  • Retuning of statistics, for large and small scales (HIGH, 3 p.m.)
  • Estimation of the vertical correlations for errors on snow depth (MEDIUM, 3 p.m.)
  • Calculation and use of a snow mask derived from satellite data (MEDIUM, 6 p.m.)
  • Use of pseudo-observations from local networks (MEDIUM, 3 p.m.)
  • Analysis / correction of new fields (albedo) (LOW, 3 p.m.)
  • Improved climatological fields (MEDIUM, 2 p.m.)

Assimilation of soil moisture and temperature :
  • Reduction of the horizontal heterogeneity of soil moisture (HIGH, 2 p.m.)
  • Retuning and implementation in ALADIN (HIGH, 6 p.m.)
  • Combination with dfi-blending (MEDIUM, 3 p.m.)
  • Combination with 3d-var (MEDIUM, 3 p.m.)
  • Moving to a variational assimilation (MEDIUM, 12 p.m.)
  • Use of satellite data (MEDIUM, 12 p.m.)
  • Improved climatological fields (MEDIUM, 2 p.m.)

Diag-Pack :
  • Improvements in observation operators (vertical interpolations) (MEDIUM, 3 p.m.)
  • Use of aircraft, profiler and radar-wind data (MEDIUM, 6 p.m.)
  • Diagnostic fields (smoothing, new ones) (HIGH, 3 p.m.)

Quality control :
  • Retuning of the screening for surface observations (MEDIUM, 3 p.m.)

PREDICTABILITY

Introduction

For ten years, ensemble prediction systems have been developed at ECMWF, CMC and NCEP. The usefulness of such tools has been demonstrated in medium-range forecasting. At present, ensembles are also beginning to be used in the short range : since June 2001 the NCEP short-range ensemble has been run operationally, and some experiments have been conducted at Météo-France (the PEACE project). Some workshops have also been devoted to short-range ensemble forecasting. The "state of the art" in ensemble forecasting is presented below, and illustrated by a brief description of the PEACE project (short-range ensemble forecasting at Météo-France). ALADIN should at least keep a watch on this emerging research field over the next years.

Main features of Ensemble Forecasting

Initial state uncertainties sampling

Two main techniques are presently used to generate the initial-state perturbations.

Singular vectors (SV), which define the most unstable modes, are used in the ECMWF ensemble. Tangent-linear and adjoint propagators are used to define the directions in which errors are likely to grow. This mathematically based method is limited by its high computational cost and by the simplified model used in the SV computation (low resolution, no physics).
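
In compact form (generic notation), the leading singular vectors maximize perturbation growth over the optimization interval, measured with a chosen norm E :

    \max_{\delta x}\;\frac{\|\mathbf{M}\,\delta x\|_{E}^{2}}{\|\delta x\|_{E}^{2}}
    \quad\Longleftrightarrow\quad
    \mathbf{M}^{T}\mathbf{E}\,\mathbf{M}\,\delta x = \lambda\,\mathbf{E}\,\delta x,

where M is the tangent-linear propagator and M^T its adjoint ; the eigenproblem is solved iteratively (typically with a Lanczos algorithm), which is why both TL and AD codes are needed.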

Breeding of growing modes (BGM) is used in the NCEP ensemble. It consists in running different perturbed cycles and rescaling down the perturbations (i.e. the differences between the central, unperturbed analysis and each perturbed guess) at each cycle step. The model (with full resolution and physics) itself selects the growing modes. However, there is no guarantee of spanning orthogonal directions, as is the case with SV.
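
A minimal sketch of one breeding cycle, assuming a hypothetical forecast_model function and state vectors as plain arrays ; the rescaling norm and amplitude are illustrative, and in a real cycling the reference for the difference would be the new control analysis :

    import numpy as np

    def breeding_cycle(control_analysis, perturbation, forecast_model, amplitude=1.0):
        """One illustrative step of the breeding (BGM) method."""
        control_fc   = forecast_model(control_analysis)                 # unperturbed forecast
        perturbed_fc = forecast_model(control_analysis + perturbation)  # perturbed forecast
        grown = perturbed_fc - control_fc                               # grown perturbation
        grown *= amplitude / np.linalg.norm(grown)                      # rescale down to initial size
        return grown                                                    # perturbs the next cycle's analysis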

Model uncertainties sampling

Different kinds of perturbations can be used :

  • Use of a stochastic physics (ECMWF) : random coefficients in the range 0.5 to 1.5 are applied to the physical tendencies. These coefficients are updated every 6 hours and the same random number is used for all grid points inside a 10° x 10° box (a minimal sketch is given after this list).

  • Use of different models (CMC)

  • Tuning physical parameters : knowing the uncertainty around the value of some physical parameters, it is possible to tune these values inside the uncertainty interval.
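
A minimal sketch of the stochastic-physics perturbation described in the first item above, assuming hypothetical latitude/longitude arrays in degrees ; box size, range of the factors and update frequency follow that description (10° x 10°, 0.5 to 1.5, every 6 hours) :

    import numpy as np

    def stochastic_physics_factors(lat, lon, rng, box_deg=10.0, low=0.5, high=1.5):
        """Multiplicative factors for physical tendencies (illustrative).

        One random factor in [low, high] is drawn per box_deg x box_deg box and
        shared by all grid points inside that box ; call again every 6 hours of
        model time to refresh the factors.
        """
        n_lat, n_lon = int(180 / box_deg), int(360 / box_deg)
        i_box = np.minimum(np.floor((lat + 90.0) / box_deg).astype(int), n_lat - 1)
        j_box = np.floor((lon % 360.0) / box_deg).astype(int) % n_lon
        factors = rng.uniform(low, high, size=(n_lat, n_lon))
        return factors[i_box, j_box]

    # usage (hypothetical arrays) :
    # perturbed_tendency = physics_tendency * stochastic_physics_factors(lat2d, lon2d, np.random.default_rng(0))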

Specificities for LAM Ensembles

"Perturbed lateral boundaries" : Perturbing the boundaries is important for short-range ensemble forecasting with LAMs. These perturbed lateral boundaries can be generated by using different global models or can be provided by a global ensemble prediction system.

"Initial state perturbation generation" : The singular vector computation is already implemented in Aladin. This method can then be easily tested. However it is costly and the efficiency of high resolution singular vectors is not yet proved. The NCEP short range ensemble is using a regional adaptation of the breeding method.

Verification

The quality of an ensemble system is described by the statistical characteristics of the joint distribution of forecasts and observations. One of the difficulties of ensemble verification is to obtain a sufficient sample. Specific tools are to be used : the Brier score, which measures reliability (how well the forecast probabilities are calibrated) and resolution (how informative the ensemble is) ; ROC curves, for the evaluation of the probabilistic forecast skill for a specific event ; Talagrand diagrams and the spread-skill relationship, which give information on the ensemble distribution.
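
For reference, the Brier score for a binary event, with forecast probabilities p_i and observed outcomes o_i in {0, 1} over N cases, is

    BS = \frac{1}{N} \sum_{i=1}^{N} (p_i - o_i)^2,

and it can be decomposed into reliability, resolution and uncertainty components, the first two being the quantities referred to above.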

The PEACE project at Météo-France

The aim of the PEACE project (Prévision d'Ensemble A Courte Echéance) is to build an efficient ARPEGE-based global ensemble for the short-range (24-48 h) detection of strong storms. The present configuration is the following :

  • T63 singular vectors targeted over Western Europe at 24 h

  • 11 members (10 perturbed + 1 control)

  • operational version of ARPEGE (T199c3.5) for the forecast part

For the time being, results are quite disappointing in terms of spread but encouraging in terms of Brier score, especially its resolution component for the 10 m wind speed. Future plans involve the breeding method (perhaps mixed with SV), with a possible use of blending techniques instead of the rescaling.

Proposals for ALADIN

There is a general agreement on the necessity to investigate ensemble prediction with ALADIN. Unfortunately this new topic will have a far lower priority than the other research topics. Nevertheless, it was decided to look for ways of conducting case studies with a LAM ensemble. First of all it is of major importance to define the objectives of such an ensemble : detection of severe weather (heavy precipitation events ?), probabilistic information, ...

This could take the form of a PhD, as was done in the HIRLAM group. It is suggested that these experiments be linked with the PEACE project, so as to take advantage of the corresponding developments and experience. Moreover, the global ARPEGE ensemble could provide perturbed lateral boundaries for the LAM ensemble. Another blocking point is the computational cost, as running an ensemble can be quite expensive.

CONCLUSIONS

This new plan may look quite huge and impressive at first. However it reflects the increasing maturity of the ALADIN project through the following aspects :

  • It was elaborated after consultation and by representatives of all willing Partners (almost all).

  • It is reasonable, relatively less ambitious than the previous one or the ALATNET program.

  • It is detailed (but hopefully not too much), to better spread information, make everyone aware of the amplitude of each task, and maybe allow deported work to start sooner.