IBM Zurich Research Laboratory
Workplace: Zurich, Zurich region, Switzerland
Position: Master’s thesis

Designing Large-Scale Satellite-Agnostic Geospatial Foundation Models

Ref. 2024_001

In this project, you will develop and evaluate novel approaches for building satellite-agnostic geospatial foundation models capable of processing multi-resolution data from diverse sensors during pretraining and fine-tuning in a computationally efficient way. The project is carried out jointly with the Photogrammetry and Remote Sensing Group at ETH Zurich and requires an affiliation with ETH. The project provides access to IBM HPC infrastructure.

Significant progress in the development of highly adaptable and reusable artificial intelligence (AI) models is expected to have a major impact on earth science and remote sensing. Foundation models are pre-trained on large unlabeled datasets via self-supervision and subsequently fine-tuned with small, labeled datasets for various downstream tasks. The abundance of unlabeled data from multiple satellites makes remote sensing an ideal domain for pre-training large-scale foundation models. However, recent foundation models for geospatial analysis and remote sensing applications exploit only a fraction of the available bands, typically from a single satellite or at a fixed resolution. Consequently, the scientific community is increasingly interested in approaches for building interoperable (i.e., satellite-agnostic) models for Earth observation and remote sensing that efficiently exploit multi-sensor, multi-resolution data.

Goal

This project aims to explore transformer architectures in order to develop interoperable, multi-resolution geospatial foundation models that encode information from a wide range of spectral bands. This is an essential step toward further enhancing AI models for remote sensing and unlocking diverse environmental applications.
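To illustrate the satellite-agnostic idea in the goal above, the sketch below shows one possible design (an assumption for illustration, not the project's prescribed method): each sensor gets its own lightweight projection that maps patches with differing band counts into a shared token dimension, so a single transformer encoder can consume tokens from any satellite. Sensor names, band counts, and patch size are illustrative.

```python
import numpy as np

# Hypothetical sketch: per-sensor linear projections into a shared embedding
# space, so one transformer backbone can ingest data from different satellites.

EMBED_DIM = 64
PATCH = 16
rng = np.random.default_rng(0)

# Band counts are illustrative examples, not specified by the posting.
sensor_bands = {"sentinel2": 13, "landsat8": 11}

# One projection matrix per sensor: (bands * PATCH * PATCH) -> EMBED_DIM.
projections = {
    name: rng.standard_normal((bands * PATCH * PATCH, EMBED_DIM))
    / np.sqrt(bands * PATCH * PATCH)
    for name, bands in sensor_bands.items()
}

def embed_patches(sensor: str, patches: np.ndarray) -> np.ndarray:
    """Flatten (N, bands, PATCH, PATCH) patches and project to (N, EMBED_DIM)."""
    flat = patches.reshape(patches.shape[0], -1)
    return flat @ projections[sensor]

# Patches from two sensors with different band counts land in the same space.
s2_tokens = embed_patches("sentinel2", rng.standard_normal((4, 13, PATCH, PATCH)))
l8_tokens = embed_patches("landsat8", rng.standard_normal((4, 11, PATCH, PATCH)))
assert s2_tokens.shape == l8_tokens.shape == (4, EMBED_DIM)
```

In an actual PyTorch implementation, each projection would be a trainable `nn.Linear` (or strided convolution) and the shared encoder a standard ViT-style transformer; the sketch only demonstrates the interface that makes the backbone sensor-agnostic.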

Qualifications

  • Bachelor’s degree in computer science, deep learning, or a related technical field, or equivalent practical experience
  • Proficiency with Python, PyTorch, Linux, and GitHub/GitLab
  • Problem-solving mindset: ability to explore technical literature, cross-reference relevant information, and synthesize novel ML architectures
  • Ways to stand out: hands-on experience with HPC systems and job scheduling, and prior experience with large-scale transformer architectures, in particular positional encodings
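Since the qualifications above single out positional encodings, the following minimal sketch (an illustrative assumption, not project code) shows why fixed sine-cosine encodings are attractive for multi-resolution inputs: they are computed from patch-grid coordinates rather than learned per index, so the same function covers grids of any size.

```python
import numpy as np

def sincos_pos_encoding(h: int, w: int, dim: int) -> np.ndarray:
    """2D sine-cosine positional encoding for an h x w patch grid.

    Computed from coordinates, not learned per index, so it extends to
    arbitrary grid sizes -- useful when input resolution varies by sensor.
    """
    assert dim % 4 == 0, "dim must be divisible by 4 (sin/cos per spatial axis)"
    quarter = dim // 4
    freqs = 1.0 / (10000 ** (np.arange(quarter) / quarter))  # (quarter,)
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")

    def axis_enc(coords: np.ndarray) -> np.ndarray:
        angles = coords.reshape(-1, 1) * freqs  # (h*w, quarter)
        return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

    # Half of the channels encode the y axis, half the x axis.
    return np.concatenate([axis_enc(ys), axis_enc(xs)], axis=1)  # (h*w, dim)

# The same encoder dimension serves coarse and fine patch grids alike.
pe_coarse = sincos_pos_encoding(8, 8, 64)
pe_fine = sincos_pos_encoding(16, 16, 64)
assert pe_coarse.shape == (64, 64) and pe_fine.shape == (256, 64)
```

Learned positional embeddings, by contrast, are tied to a fixed token count and must be interpolated when resolution changes, which is one of the design trade-offs this project could examine.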


Diversity

IBM is committed to diversity at the workplace. With us you will find an open, multicultural environment. Excellent flexible working arrangements enable all genders to strike the desired balance between their professional development and their personal lives.

How to apply

If you are interested in this exciting position, please submit your application through the link below.


For technical questions, please contact Dr. Angeliki Pantazi, agp@zurich.ibm.com, or Dr. Armin Knoll, ark@zurich.ibm.com.



In your application, please refer to myScience.ch and reference JobID 63374.

