Dr. Chariton Kalaitzidis: Opening Session

Dr. Jing M Chen: Terrestrial Carbon Cycle Estimation Based on Remote Sensing of Vegetation Structure and Function

The terrestrial carbon cycle is one of the most uncertain components of the Earth's climate system. Satellite remote sensing provides indispensable information for estimating the spatiotemporal variations of the terrestrial carbon cycle. Since the advent of Earth observation from space in the early 1970s, the methodology of using satellite data for carbon cycle research has evolved rapidly. Land cover classification using optical imagery provides the essential information for this purpose and has the longest history. In the early days, vegetation indices were developed to estimate the fraction of photosynthetically active radiation (FPAR) absorbed by vegetation, which allows for global mapping of gross primary productivity (GPP) and net primary productivity (NPP) with first-order accuracy. This development was followed by mapping canopy structural parameters such as leaf area index (LAI) and clumping index (CI) using multi-spectral and multi-angle sensors. The availability of LAI and CI makes it possible to separate the vegetation in a pixel into sunlit and shaded leaf groups in order to account for the biological differences in photosynthesis between these two leaf groups, greatly improving the accuracy of GPP and NPP modeling. However, in photosynthesis modeling using vegetation structural parameters, leaf biological traits that affect leaf function, such as the carboxylation rate, pigment contents and nitrogen content, are unknown and need to be prescribed. Recent research indicates that narrow-band remote sensing data in various spectral regions can be utilized to obtain information on the biological function of vegetation: for example, the photochemical reflectance index (PRI) is related to light use efficiency (LUE), and sun-induced chlorophyll fluorescence (SIF) is proportional to GPP. Our recent research suggests that it is now feasible to retrieve leaf chlorophyll content (LCC) at the global scale using satellite data.
LCC is useful for the interpretation of PRI and SIF signals resulting from non-photochemical quenching, which is related to leaf photosynthesis. LCC can also be used as a direct proxy for the leaf carboxylation rate. We expect that combining vegetation structural and functional parameters retrievable from satellite remote sensing would greatly improve terrestrial carbon cycle estimation. In this lecture, I will review the various remote sensing methodologies for retrieving vegetation structural and functional parameters and introduce GPP and NPP modeling techniques that attempt to make full use of these parameters. These techniques also have implications for the estimation of soil carbon and heterotrophic respiration.
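As a minimal sketch of the sunlit/shaded partitioning the abstract describes, the snippet below uses a common two-leaf formulation in which sunlit LAI depends on total LAI, the clumping index and the solar zenith angle; the per-group photosynthesis rates `a_sun` and `a_shade` are illustrative placeholders, not values from the lecture:

```python
import math

def partition_lai(lai, clumping, sza_deg):
    """Split total LAI into sunlit and shaded leaf area (two-leaf scheme)."""
    cos_sza = math.cos(math.radians(sza_deg))
    # Sunlit LAI from total LAI, clumping index and solar zenith angle
    lai_sun = 2.0 * cos_sza * (1.0 - math.exp(-0.5 * clumping * lai / cos_sza))
    lai_shade = max(lai - lai_sun, 0.0)
    return lai_sun, lai_shade

def two_leaf_gpp(lai, clumping, sza_deg, a_sun, a_shade):
    """Canopy GPP as the sum of the two leaf-group contributions.
    a_sun / a_shade are illustrative mean photosynthesis rates per unit leaf area."""
    lai_sun, lai_shade = partition_lai(lai, clumping, sza_deg)
    return a_sun * lai_sun + a_shade * lai_shade
```

The point of the separation is visible in the second function: because sunlit leaves typically photosynthesize at a higher rate than shaded ones, scaling each group by its own rate improves on a single big-leaf average.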

Dr. Ioannis Gitas: The Greek National Observatory of Forests (NOF)

The main goal of the research project is the establishment and pilot operation of the Greek National Observatory of Forests (NOF), aiming at the creation of an inventory of all available forest management plans of Greece, and at ensuring accessibility of the digital forest data through a web-based application. The project is a collaboration between the General Directorate of Forest and Agro-environment Development and Protection (Ministry of Reconstruction of Production, Environment and Energy) (beneficiary) and the Laboratory of Forest Management and Remote Sensing (Aristotle University of Thessaloniki) (contractor), funded by the Greek State (Green Fund).
More information can be found at: http://epad.web.auth.gr/about

Dr. Nektarios Chrysoulakis: Earth Observation for Urban Climate

Dr. Susan L Ustin: Use of remote sensing to assist agricultural management

Remote sensing has been used for agricultural decision making since the 1980s. Today's instruments, algorithms and methods have a long history of development for agricultural applications and for other managed landscapes. We now have a wide range of data choices, from satellites such as ESA's Sentinel-2 series and the USGS Landsat satellites to various airborne sensors and even data collected by small drones. Some of these instruments are as simple as a 3-band digital camera, while others offer high-fidelity imaging spectroscopy, thermal infrared imagery or lidar data. I will present a series of case studies illustrating the types of agriculturally useful information that can be derived from these instruments, with an emphasis on Mediterranean-region crops.

Dr. Martin Isenburg: Hands-on Course to LiDAR processing with LAStools

More information can be found at: http://rapidlasso.com/2015/09/21/creating-dtms-from-dense-matched-points-of-uav-imagery-from-senseflys-ebee/

Dr. Claudia Notarnicola: Biophysical parameters from remotely sensed imagery as a tool for monitoring and evaluating ecosystem biodiversity

Today, alongside the Essential Climate Variables, the process of defining Essential Biodiversity Variables has begun; these variables can support the monitoring and protection of ecosystem biodiversity.

In this context, the benefits of remotely sensed imagery for biodiversity indicators are related to the synoptic view, regular and repeatable acquisitions, multi-annual time series of observations and cost-effectiveness for remote and inaccessible areas. Moreover, the new satellite sensors (such as the Sentinel family) are notably increasing the remote sensing capabilities. Such large data availability has determined the necessity to develop accurate and robust retrieval methods to improve the estimation of these variables from remotely sensed imagery.

The retrieval of these parameters from satellite images (optical and radar) is typically a challenging task and falls into the category of ill-posed problems. This means that, beyond the non-linearity of the relationship between the input features (sensor measurements) and the target variables (soil moisture, biomass, etc.), more than one combination of soil characteristics may lead to the same electromagnetic response at the sensor. Furthermore, given a scene of interest, each system will provide information on a different aspect of the phenomena on the ground (e.g., the spatial patterns or the temporal dynamics) and may also be affected to different extents by different disturbing factors.

This suggests the importance of a synergistic use of the multiple available remote sensing systems (from satellite to drone-based sensors) for a comprehensive, accurate and robust understanding and monitoring of natural processes on the ground. On the other hand, the proper selection of the retrieval approach is a key issue.

In this context, the seminar will present currently available techniques for the retrieval of biophysical parameters from remotely sensed data, addressing the inversion of physically based models as well as parametric and non-parametric approaches such as Bayesian procedures, neural networks, support vector regression and ensemble techniques. Each approach will be presented in specific applications, indicating advantages, disadvantages and perspectives for upcoming missions such as Sentinel-1 and Sentinel-2. In addition, the synergistic use of different sensors (optical and radar) will be specifically addressed in the context of the retrieval process.
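To make the inversion idea concrete, here is a minimal sketch of lookup-table inversion of a forward model: the linear backscatter-to-soil-moisture response below is purely illustrative, standing in for a physically based scattering model, and the ambiguity the abstract mentions arises precisely when this forward mapping is not one-to-one.

```python
def forward_model(soil_moisture):
    """Hypothetical forward model: radar backscatter (dB) as a
    function of volumetric soil moisture. Illustrative only."""
    return -20.0 + 30.0 * soil_moisture

def lut_invert(observed_db, grid=None):
    """Lookup-table inversion: return the candidate parameter value
    whose simulated observation is closest to the measurement."""
    if grid is None:
        # Candidate soil-moisture values from 0.00 to 0.50 m^3/m^3
        grid = [i * 0.001 for i in range(501)]
    return min(grid, key=lambda m: abs(forward_model(m) - observed_db))
```

Real retrieval schemes replace the toy forward model with a radiative transfer or scattering model and the nearest-match rule with Bayesian, neural network or support vector regression estimators, but the structure of the problem is the same.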

GEOSENSE – Vassilis Polychronos: Drones used for precision agriculture

As the demand for food production constantly increases, farmers are increasingly concerned about sustainable farmland management. To accurately calculate application rates, one needs more detailed information about agricultural land. 'Agricultural' UASs (drones) play a significant role in providing spectral information on plant biochemical and biophysical parameters, such as vegetation indices and soil characteristics, as well as on aspects such as topography (exact mapping, inclination, basins, etc.). Drones can identify plant hot spots at a very early growth stage, where immediate application of agricultural chemicals can result in increased crop production. During the SPLITRS 2016 practical lecture, we will fly an eBee and collect data over a study area. We will then analyse the measurements to show the capability of a drone to collect agricultural information. The workflow of the mission and the image post-processing will also be developed and performed in order to produce both topographic products and the NDVI vegetation index.
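As a minimal sketch of the NDVI computation mentioned above, applied per pixel to near-infrared and red surface reflectance:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance.
    Ranges from -1 to 1; dense, healthy vegetation yields high values."""
    if nir + red == 0:
        return 0.0  # guard against division by zero over dark pixels
    return (nir - red) / (nir + red)
```

Healthy vegetation reflects strongly in the near-infrared and absorbs red light for photosynthesis, which is why the normalized difference highlights vigorous plants and can reveal stressed "hot spots" early.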

Spectral Evolution: In-situ measurements with a portable spectroradiometer and data analysis

GEOSYSTEMS HELLAS (Authorized Hexagon Solutions), Dimitrios Bliziotis, Surveying Engineer, RS Technical Manager

    • Hexagon Solutions GPS Field Measurements in MAICH Campus and GPS real time with MetricaNet

GEOSYSTEMS HELLAS (Authorized Hexagon Solutions), Charalampos Manesis, Electric Engineer, Senior Technical Manager, Dimitrios Bliziotis, Surveying Engineer, RS Technical Manager

    • Hexagon Web GIS Solutions: ERDAS Apollo
    • Imagine Photogrammetry, ERDAS Imagine Spatial Modeler, Remote Sensing process on the web
    • ERDAS Apollo & ERDAS Imagine Spatial Modeler, Remote Sensing process on the web & CEMS GEORSS

INFOREST RESEARCH (Authorized Harris Space and Intelligence Systems) – Maroulio Chanioti and Dr. Olga Sykioti (Institute for Space Applications and Remote Sensing, National Observatory of Athens): HIS indicators and classification methods using ENVI software

We will focus on a dedicated hands-on workshop during which attendees will work with, and become familiar with, real image datasets. There will be two parts: the first dedicated to the required pre-processing steps and the second to vegetation classification processes. In the first part, there will be an introduction to basic image data manipulation, and in particular to atmospheric corrections, in order to obtain surface reflectances.

In the second section, attendees will perform a classification workflow to categorize the pixels in an image into a number of classes. In the first part of this section, they will perform an unsupervised classification with no training data, using the ISODATA algorithm. Unsupervised classification clusters the pixels in a dataset based exclusively on their statistics, without any user-defined training classes. In the second part of this section, they will create training data interactively in the dataset and use it to perform a supervised classification. Supervised classification assigns the pixels in a dataset to classes based on pre-defined training data. The results of different methods, such as maximum likelihood, minimum distance, Mahalanobis distance and the Spectral Angle Mapper (SAM), will be compared. Time permitting, all of the above will be carried out.
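Of the supervised methods listed, the Spectral Angle Mapper is the simplest to sketch: it treats each spectrum as a vector and assigns a pixel to the class whose reference spectrum makes the smallest angle with it. A minimal version, with hypothetical two-band reference spectra:

```python
import math

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum.
    Smaller angles mean more similar spectral shapes; because only the
    direction of the vector matters, SAM is insensitive to overall
    illumination differences."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    cos_a = max(-1.0, min(1.0, dot / (norm_p * norm_r)))
    return math.acos(cos_a)

def sam_classify(pixel, references):
    """Assign the pixel to the class whose reference spectrum
    makes the smallest spectral angle with it."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))
```

In ENVI the same idea is applied band-by-band to full image cubes, with the reference spectra taken from the interactively drawn training regions.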