<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Progression on Rachid Youven Zeghlache</title><link>https://youvenz.github.io/tags/progression/</link><description>Recent content in Progression on Rachid Youven Zeghlache</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 01 Jan 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://youvenz.github.io/tags/progression/index.xml" rel="self" type="application/rss+xml"/><item><title>Longitudinal Deep Learning</title><link>https://youvenz.github.io/research/longitudinal-deep-learning/</link><pubDate>Thu, 01 Jan 2026 00:00:00 +0000</pubDate><guid>https://youvenz.github.io/research/longitudinal-deep-learning/</guid><description>&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;Longitudinal learning models the temporal evolution of a patient&amp;rsquo;s condition from sequences of observations collected over months or years. Unlike standard classification or segmentation tasks, longitudinal models must handle irregular time intervals and missing visits while respecting the inherent continuity of biological processes. My PhD research produced several architectures designed specifically for this setting.&lt;/p&gt;
&lt;h2 id="key-contributions"&gt;Key contributions&lt;/h2&gt;
&lt;h3 id="latim--latent-time-models"&gt;LatiM — Latent Time Models&lt;/h3&gt;
&lt;p&gt;LatiM is a continuous-time latent variable model that represents disease state as a trajectory in a learned latent space. A Neural ODE governs how the latent state evolves between observations, enabling prediction at arbitrary future time points without discretising the timeline.&lt;/p&gt;</description></item></channel></rss>