Sharing knowledge, lessons learned and data
To learn from Field Operational Tests (FOTs), it is crucial to collect lessons learned. Lessons learned may concern many aspects of FOTs, and each step in the FESTA methodology may lead to lessons learned. FESTA strongly recommends documenting all steps of an FOT, as this is important both for interpreting the results and for learning from the (good and bad) experiences.
Lessons learned may have four elements:
- What went well, and why?
- What went wrong, why, and what was the consequence?
- What could have been done better and how?
- What is the relevance of the lesson learned for other projects?
Lessons can be collected at all levels of detail, and all can be useful. Very detailed ones serve as a kind of recipe (e.g. “software tool X is handy for the analysis”) or help avoid similar mistakes. If the level is too vague, the lesson may become a truism (e.g. “use sufficient resources”). However, even a rather generic lesson may be useful as a pointer to a project document where more details can be found. In addition to tested best/good practices, hints on potential improvements that could still be tried out are also useful.
We may distinguish several types of lessons learned:
- Set-up and execution of field test
- The systems and functions under evaluation, their description
- Data gathered in the project and external data, data processing and storage
- The evaluation process itself
- The evaluation methodology applied
- Evaluation tools and sensors
- The experiences and acceptance of the users
- The communication with stakeholders
- Sharing of data with project and external partners.
An example template for gathering and documenting lessons learned is given in the table below:

|Template field|
|---|
|Short name of the lesson learned|
|Category of lesson learned|
|FESTA V step|
|Optional: relevance for specific projects|
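To make such records easy to collect, compare, and share across partners, the template can be captured in a simple structured form. A minimal sketch in Python follows; the field names are illustrative choices, not prescribed by FESTA, and combine the template fields above with the four elements a lesson learned may have:

```python
from dataclasses import dataclass


@dataclass
class LessonLearned:
    # Template fields (names are illustrative, not prescribed by FESTA)
    short_name: str      # short name of the lesson learned
    category: str        # one of the lesson types listed above
    festa_v_step: str    # step in the FESTA V where the lesson arose
    relevance: str = ""  # optional: relevance for specific projects
    # The four elements a lesson learned may have
    went_well: str = ""
    went_wrong: str = ""      # including why, and the consequence
    could_improve: str = ""   # what could have been done better, and how


# Hypothetical example record
lesson = LessonLearned(
    short_name="Sensor calibration drift",
    category="Evaluation tools and sensors",
    festa_v_step="Data acquisition",
    went_wrong="Position logger drifted during long sessions; "
               "consequence: some runs had to be repeated",
)
```

Keeping lessons in a uniform structure like this makes it straightforward to filter them by category or FESTA V step when a new project searches for relevant experiences.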
Set-up and execution of a field test
In setting up the test many choices and decisions have to be made, for example, selection of research questions and experimental environment. Practical issues have to be solved, such as the availability of vehicles and participants, use of resources, and permissions needed from authorities and ethical committees. Not all problems can be foreseen and new issues may arise during the execution of the field test. Learning from the experiences from other projects may be very valuable in avoiding and mitigating problems.
The systems and functions under evaluation, their description
A large variety of systems and functions may be evaluated, and a wealth of experience is available from projects on how to deal with the detailed technical aspects. With fast-paced development, lessons learned may be too specific and soon become outdated. However, it may be very useful to search for experiences from other projects that evaluated systems with similar functions.
Data gathered in the project and external data, data processing and storage
For both data gathered by the project and data from external sources there may be many issues, e.g. with the content and quality of the data. Some of these are not known before the analysis process begins. Increasing common knowledge about data sources of sufficient quality, or at least about the weaknesses of different (especially public) data sources, would save time and effort.
The evaluation process
Lessons learned concern how the evaluation was set up in an FOT, from setting goals to reaching the final conclusions. Examples are the way in which issues in the project were managed and structured, goals were set and realised, resources were managed, and results were interpreted and justified.
The evaluation methodology applied
As the FESTA methodology is regularly updated and, in the future, a Common Evaluation Methodology for CCAM will be developed, lessons learned about the evaluation methodology are very valuable, both for improving the application of the methodology and for improving the methodology itself.
Evaluation tools and sensors
New tools and sensors are rapidly becoming available, so lessons about specific tools may become obsolete at some point. However, it may be useful to search for experiences from other projects before acquiring hardware and software tools and sensors, or before developing them within the project itself.
The experiences and acceptance of the users
Performing user tests with complex systems such as automated vehicles in a natural environment may be complex and challenging, especially if the systems are not yet mature and are still under development. Users may not even be allowed to operate the systems themselves, or the systems can only be tested in artificial circumstances. Sharing best practices on how to enhance the user experience may be very helpful.
Communication with stakeholders
The goal of an FOT is usually to produce results that are useful for stakeholders. Lessons learned about stakeholder analysis, managing the expectations of stakeholders, communicating with stakeholders, and translating technical results into information usable by different groups of stakeholders are very valuable. These lessons are not always easy to share for confidentiality and commercial reasons.
Sharing data with project and external partners
When working with external partners, some data, results and methods can be shared, but not all. Even within a project, some data will remain confidential. Practical solutions found for data sharing within the project and outside the project are useful for others.
Finally, FESTA encourages FOTs to share their methods, tools, knowledge, and data. Re-use of existing methods and tools can make a big difference in the time needed to prepare an FOT. Sharing knowledge and data can offer interesting collaboration and dissemination options. Also, consulting evaluation experts can help to clarify the available options regarding study design and how to make the most of available resources.