13000 / 13133 - Grants - 2026
13133 / 13162 - Visual Recognition Group
Projects Supported by Grants 2026
- Kordopatis-Zilos, G.: Learning a Universal Similarity Function
- 2025 - 2026, SEP-210981333
- Kúkelová, Z.: New generation of camera geometry solvers
- 2022 - 2027, GM22-23183M
- Matas, J.: Central European Digital Media Observatory 2.0
- 2024 - 2026, SEP-210959313
- Matas, J.: FunDive: Monitoring and mapping fungal diversity for nature conservation
- 2024 - 2026, SS73020004
- Matas, J.: CEDMO 2.0 NPO
- 2024 - 2026
- Neumann, L.: Inductive Biases of Deep Neural Networks in Computer Vision
- 2024 - 2028, GM24-10738M
- Psomas, V.: Retrieval-Augmented VIsion-Language Models for Open-vocabulary LocalizatIon
- 2025 - 2027, 101205297
- Shekhovtsov, O.: Learning Quantized Neural Networks, Discrete Choices and Representations
- 2024 - 2026, GA24-12697S
- Tolias, G.: Instance-level Visual Recognition and Generation
- 2026 - 2028, GA26-24228S
- Vojíř, T.: Detection of anomalous or out-of-distribution inputs in deep neural network models
- 2025 - 2027, GA25-15993S
13133 / 13163 - Biomedical imaging algorithms
Projects Supported by Grants 2026
- Kybic, J.: Scanning of the Meiotic Spindle in Assisted Reproductive Techniques for assessment of oocyte quality and embryo ploidy evaluated by AI (SMART study)
- 2024 - 2027, NW24-08-00048
- Kybic, J.: Leveraging expert knowledge for medical image segmentation
- 2026 - 2028, GA26-23522S
13133 / 13164 - Vision for Robotics and Autonomous Systems
Projects Supported by Grants 2026
- D'Angelo, G.: Event Driven Active Vision for Object Perception
- 2024 - 2026, 101149664
- D'Angelo, G.: Neuromorphic active vision for embodied object perception
- 2026 - 2028, GA26-23432S
- Hoffmann, M.: Event-driven active vision for object perception
- 2024 - 2026, GP24-12940I
- Hoffmann, M.: Understanding infant body know-how development through baby humanoid robots
- 2025 - 2027, GA25-18113S
- Zimmermann, K.: End-to-end differentiable physics-aware architectures for self-supervised learning in robotics
- 2024 - 2026, GA24-12360S
- Zimmermann, K.: Explainable, Safe, Contact-Aware Planning and Control for Heavy Machinery Manipulation and Navigation
- 2024 - 2028, 101189836
13133 / 13166 - Analysis and Interpretation of Biomedical Data
Projects Supported by Grants 2026
- Novák, D.: Brain Dynamics
- 2024 - 2028, CZ.02.01.01/00/22_008/0004643
- Novák, D.: Digital phenotyping as a marker of clinical development in major psychiatric disorders
- 2025 - 2028, NW25-04-00081
13133 / 13167 - Multi-robot Systems
Projects Supported by Grants 2026
- Báča, T.: TOMSNAV: Topological Multi-modal and Semantic Navigation for Aerial Vehicles
- 2025 - 2029, GM25-17779M
- Pěnička, R.: TOPFLIGHT: Trajectory and Mission Planning for Agile Flight of Aerial Robots in Cluttered Environments
- 2023 - 2027, GM23-06162M
- Pěnička, R.: Multi-robot aerial system for autonomous surveillance and inspection of large industrial complexes resistant to communication and GNSS signal failures
- 2025 - 2027, FW12010288
- Saska, M.: Multirobotic system for autonomous field operations with multisource navigation
- 2025 - 2028, OZ01020024
- Saska, M.: Flying robot using artificial intelligence for the safe capture of unauthorized drones in protected airspace
- 2025 - 2028, CZ.01.01.01/01/24_063/0006766
- Saska, M.: Sensing abstract behavioural patterns to allow coordinated fast response to disruptions in multirobot systems
- 2026 - 2028, GA26-22419S
- Štěpán, P.: Robotic systems for precision agriculture
- 2023 - 2026, CZ.01.01.01/01/22_002/0000447
- Vonásek, V.: Sampling methods for motion planning and control using learned spaces
- 2026 - 2028, GA26-22606S
13133 / 13168 - Machine Learning
Projects Supported by Grants 2026
- Franc, V.: Uncertainty-Aware Machine Learning Models for Open-World Decision-Making
- 2026 - 2028, GA26-22444S
- Navara, M.: Orthogonality and Symmetry
- 2025 - 2027, GF25-20013L