e-infrastructure Roadmap for Open Science in Agriculture

A bibliometric study

The e-ROSA project seeks to build a shared vision of a future sustainable e-infrastructure for research and education in agriculture, in order to promote Open Science in this field and thereby contribute to addressing related societal challenges. To achieve this goal, e-ROSA’s first objective is to bring together the relevant scientific communities and stakeholders and engage them in the co-elaboration of an ambitious, practical roadmap that provides the basis for the design and implementation of such an e-infrastructure in the years to come.

This website highlights the results of a bibliometric analysis conducted at a global scale to identify key scientists and associated research-performing organisations (e.g. public research institutes, universities, Research & Development departments of private companies) that work in the field of agricultural data sources and services. If you have any comments or feedback on the bibliometric study, please use the online form.

You can access and explore the interactive graphs on the project website.



Vision-based Automatic Inspection of Insects in Pheromone Traps


Insects are one of the most important factors threatening yield efficiency in agricultural areas. Because insects reproduce massively, expenditures on biological pesticides form a large portion of total expenses. By observing the reproduction stages of the insects, more effective and smarter pest-control scenarios can be achieved using biotechnical approaches such as pheromone traps rather than biological ones. With pheromone traps, massive reproduction is prevented because the male insects are attracted to the traps and cannot mate with the females. The most important disadvantage of pheromone traps is the expensive labor cost of physically patrolling them: inspection of the traps requires expert staff who can recognize different kinds of insects. Beyond the high labor costs, the human factor in the cycle introduces further problems, such as errors in counting and in recording the collected data. To overcome these problems, cameras can be integrated into the traps to lower labor costs and ensure more accurate records of insect counts and types. The visual data acquired through the traps can then be inspected automatically using state-of-the-art computer vision techniques. The objective of this paper is to analyze and advance methods that can discriminate and classify the insects in the traps under challenging illumination and environmental conditions using computer vision and machine learning algorithms. In this study, we use background subtraction and active contour models successively to separate the insects from the background and extract their outer boundaries. We extract features using Hu moments (Hu), Elliptic Fourier Descriptors (EFD), Radial Distance Functions (RDF) and Local Binary Patterns (LBP). Based on the individual performance of each method, LBP features outperform the rest in recognition rate.
The results from the underlying features are then fused using weighted majority voting to obtain a decision.
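Two of the abstract's building blocks can be sketched briefly in Python: a basic 8-neighbour Local Binary Pattern texture descriptor, and weighted majority voting over per-feature classifier outputs. This is a minimal illustration under stated assumptions, not the authors' implementation: the LBP variant, the function names, and the per-feature weights are all hypothetical choices for demonstration.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour Local Binary Pattern: each interior pixel is
    encoded by thresholding its 8 neighbours against the centre value
    and packing the comparison results into an 8-bit code."""
    g = gray.astype(np.int32)
    h, w = g.shape
    centre = g[1:-1, 1:-1]
    codes = np.zeros_like(centre)
    # (row offset, col offset, bit weight) for the 8 neighbours
    offsets = [(-1, -1, 1), (-1, 0, 2), (-1, 1, 4), (0, 1, 8),
               (1, 1, 16), (1, 0, 32), (1, -1, 64), (0, -1, 128)]
    for dr, dc, weight in offsets:
        neigh = g[1 + dr:h - 1 + dr, 1 + dc:w - 1 + dc]
        codes += weight * (neigh >= centre)
    return codes

def lbp_histogram(gray, bins=256):
    """Normalised histogram of LBP codes, used as the texture feature."""
    hist, _ = np.histogram(lbp_image(gray), bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)

def weighted_majority_vote(predictions, weights):
    """Fuse class predictions from several feature-specific classifiers
    (e.g. Hu, EFD, RDF, LBP) by summing each feature's weight onto its
    predicted class and returning the highest-scoring class."""
    scores = {}
    for label, weight in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + weight
    return max(scores, key=scores.get)
```

A hypothetical fusion step, with LBP weighted highest in line with its individual recognition rate, might look like:

```python
decision = weighted_majority_vote(
    predictions=["moth", "beetle", "moth", "moth"],  # Hu, EFD, RDF, LBP
    weights=[0.6, 0.7, 0.65, 0.9])
```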

  • TR
  • Istanbul_Tech_Univ_ITU (TR)
Data keywords
  • rdf
  • machine learning
Agriculture keywords
  • agriculture
Data topic
  • big data
  • information systems
  • modeling
  • semantics
Document type


Institutions (10+ co-publications)
  • Istanbul_Tech_Univ_ITU (TR)
e-ROSA - e-infrastructure Roadmap for Open Science in Agriculture has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 730988.
Disclaimer: The sole responsibility of the material published in this website lies with the authors. The European Union is not responsible for any use that may be made of the information contained therein.