Jørgen Agersborg defending his PhD thesis. Photo: Petter Bjørklund / SFI Visual Intelligence

Successful PhD defense by Jørgen Agersborg

Congratulations to Jørgen Agersborg, who successfully defended his PhD thesis at UiT The Arctic University of Norway on November 28th.

By Petter Bjørklund, Communications Officer at SFI Visual Intelligence

Agersborg is a Doctoral Research Fellow at Visual Intelligence and UiT Machine Learning Group.

His thesis, titled "Methodological advancement of optical and radar remote sensing for Arctic forest-tundra ecotone vegetation analysis", presents novel machine learning methodologies for monitoring the Arctic forest-tundra ecotone. Agersborg's PhD project is part of the "Methodological advancement of Climate-Ecological Observatory for Arctic Tundra (COAT Tools)" project.

Agersborg's trial lecture was titled "Earth Observation foundational models and the embedding layers that result".

Summary of the thesis

Ecosystems in the high north are vulnerable to both the direct and indirect impacts of climate change. Rising temperatures, tall shrub encroachment and advancing treelines, increasing frequency of canopy defoliation events, and more extreme weather will all influence the Arctic forest-tundra ecotone, the transitional zone between Subarctic forest and Low Arctic tundra. Efficient large-scale monitoring of these areas requires extracting information from satellite remote sensing (RS) products using machine learning (ML) methodologies. The overarching goal of this thesis has been to develop such techniques for monitoring the Arctic forest-tundra ecotone as part of the "Methodological advancement of Climate-Ecological Observatory for Arctic Tundra" (COAT Tools) project.

Several challenges need to be overcome to provide efficient ML tools applicable for satellite RS-based monitoring of this region. The frequent cloud cover limits the opportunities for imaging by multispectral optical sensors. Observations from largely weather-independent synthetic aperture radar (SAR) satellites can mitigate this, especially if the full information potential of the sensor is utilised and integrated with multispectral data. The sparse and scattered distribution of vegetation classes critical for understanding this region also poses a challenge. This is aggravated by a general lack of ground reference data suitable for training ML models on satellite imagery with sufficient resolution to capture this variability.

This thesis presents methodological advancements that address these challenges. A resolution-preserving method for estimating the polarimetric SAR covariance matrices using an optical guide image is demonstrated to help differentiate live from defoliated forest canopy. Further, this method is used to generate the post-event image for a semi-supervised targeted change detection mapping of forest mortality. By applying image-to-image translation, the change detection is performed against a multispectral optical image captured before the outbreak of the defoliating pest insect that caused the disturbance. A targeted semi-supervised approach is also developed to map forest and tall shrub cover. The method exploits the similar appearance of forest and tall shrub in RS imagery to first differentiate them from all other land cover types, before a second ML model is trained to distinguish between them. The map is then created from a dataset where stacks of freely available SAR and multispectral images have been combined and utilised to perform multitemporal filtering.

The results in this thesis demonstrate that these methodological advancements can contribute to accurate and reliable large-scale monitoring of the Arctic forest-tundra ecotone based on SAR and multispectral optical satellite remote sensing.

Supervisors

  • Professor Michael Kampffmeyer, Department of Physics and Technology, UiT (Main Supervisor)
  • Associate Professor Stian N. Anfinsen, Norwegian Research Centre, NORCE
  • Senior Researcher Jane Uhd Jepsen, Norwegian Institute for Nature Research, NINA

Evaluation committee

  • Professor Iain H. Woodhouse, University of Edinburgh, United Kingdom (1st opponent)
  • Senior Researcher Emma Izquierdo-Verdiguier, University of Natural Resources and Life Sciences in Vienna, Austria (2nd opponent)
  • Professor Anthony Doulgeris, Department of Physics and Technology, Faculty of Science and Technology, UiT (internal member and committee leader)
Agersborg with the defense leader, supervisors, and evaluation committee. Photo: Petter Bjørklund / SFI Visual Intelligence.
