Learning Discrete Temporal Patterns for Time Series Forecasting
- Research topic/area
- Time Series Forecasting with Deep Learning
- Type of thesis
- Bachelor / Master
- Start time
- 05.06.2025
- Application deadline
- 31.07.2025
- Duration of the thesis
- 6 months
Description
Background
Traditional deep learning models for time series (e.g., LSTM, Transformer) often struggle with noisy, redundant, or high-dimensional input signals. Inspired by advances in sequence modeling, this project explores a novel intermediate representation to improve forecasting performance and interpretability.
Core Idea
The thesis investigates a two-stage approach where time series data are first discretized into a learned symbolic form, followed by a sequence model trained on this compact representation. This abstraction allows the model to focus on recurring temporal motifs rather than raw data (a minimal code sketch follows the list below).
Why It’s Exciting
- New representation: Extract and operate on high-level temporal units.
- Modular & extensible: Encourages transfer learning and hybrid architectures.
- Real-world impact: Applicable to scenarios with noise, missing data, or limited labels.
- Evaluation: Compare against existing state-of-the-art on standard forecasting benchmarks.
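To make the two-stage idea above concrete, here is a minimal sketch, assuming PyTorch, a vector-quantization-style codebook as the discretization step, and a causal Transformer over the resulting symbols. The module names (Discretizer, SymbolForecaster), the window length, and the codebook size are hypothetical illustration choices, not part of the topic description, and codebook training details are omitted.

```python
# Illustrative sketch only: the topic description does not fix a discretization
# method, so this assumes a vector-quantization-style codebook (VQ-VAE-like)
# followed by a small causal Transformer over the resulting symbol sequence.
import torch
import torch.nn as nn


class Discretizer(nn.Module):
    """Maps non-overlapping windows of a univariate series to learned symbol ids."""

    def __init__(self, window: int = 16, n_codes: int = 64, dim: int = 32):
        super().__init__()
        self.window = window
        self.encoder = nn.Sequential(
            nn.Linear(window, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        self.codebook = nn.Embedding(n_codes, dim)  # the learned symbolic vocabulary

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length) -> (batch, n_windows, window), dropping a ragged tail
        b, t = x.shape
        x = x[:, : t - t % self.window].reshape(b, -1, self.window)
        z = self.encoder(x)  # (batch, n_windows, dim)
        # Nearest codebook entry per window gives the discrete symbol id.
        # (Training the codebook needs straight-through gradients / VQ losses,
        # omitted here for brevity.)
        codes = self.codebook.weight.unsqueeze(0).expand(b, -1, -1)
        return torch.cdist(z, codes).argmin(dim=-1)  # (batch, n_windows)


class SymbolForecaster(nn.Module):
    """Causal Transformer that predicts the next symbol in the compact sequence."""

    def __init__(self, n_codes: int = 64, dim: int = 32):
        super().__init__()
        self.embed = nn.Embedding(n_codes, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, n_codes)

    def forward(self, symbols: torch.Tensor) -> torch.Tensor:
        mask = nn.Transformer.generate_square_subsequent_mask(symbols.size(1))
        h = self.backbone(self.embed(symbols), mask=mask)
        return self.head(h)  # (batch, n_windows, n_codes) next-symbol logits


# Usage: discretize a raw series, then forecast the next temporal motif.
series = torch.randn(8, 256)          # dummy batch of raw time series
symbols = Discretizer()(series)       # compact symbolic representation
logits = SymbolForecaster()(symbols)  # next-symbol predictions
```

Working on the symbol sequence shortens the context the forecaster has to model and makes recurring motifs explicit, which is the compactness and interpretability benefit the topic description points to.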
Learning Outcomes
- Implement unsupervised sequence compression techniques for time series.
- Apply sequence models on symbolic or latent representations.
- Conduct rigorous benchmarking and performance analysis (a short evaluation sketch follows this list).
- Investigate interpretability and robustness in challenging environments.
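As a small complement to the benchmarking outcome above, a hedged sketch of the evaluation side: the posting names standard forecasting benchmarks without fixing a dataset or metric, so MSE and MAE over held-out horizons are assumed here, and the array shapes are arbitrary.

```python
# Illustrative only: metric choice (MSE/MAE) and shapes are assumptions,
# since the topic description does not fix a benchmark protocol.
import numpy as np


def forecast_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Compute the two error metrics most forecasting benchmarks report."""
    err = y_pred - y_true
    return {"MSE": float(np.mean(err ** 2)), "MAE": float(np.mean(np.abs(err)))}


# Usage with dummy arrays shaped (n_series, forecast_horizon):
y_true = np.random.randn(32, 96)
y_pred = y_true + 0.1 * np.random.randn(32, 96)
print(forecast_metrics(y_true, y_pred))
```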
Stretch Goals (Optional)
- Study latent attention patterns and temporal abstraction.
- Experiment with self-supervised objectives for time series.
- Apply the model in domains such as energy, finance, or scientific sensor data.
Requirements
- Requirements for students
- Solid programming skills in Python
- Basic knowledge of machine learning
- Initial experience with deep learning (e.g., PyTorch or TensorFlow)
- Interest in time series analysis and modeling
- Willingness to engage with current research literature
- Good understanding of mathematics (especially linear algebra and statistics)
- Beneficial: Experience with autoencoders or transformer models
- Faculty departments
- Engineering sciences
- Electrical engineering & information technologies
- Informatics
- Energy Engineering and Management
- Financial Engineering
- Information System Engineering and Management
Supervision
- Title, first name, last name
- Dr.-Ing. Nicholas Tan Jerome
- Organizational unit
- Institute for Data Processing and Electronics
- Email address
- nicholas.tanjerome@kit.edu
- Link to personal homepage/personal page
- Website
Application via email
- Application documents
- Cover letter
- Curriculum vitae
- Grade transcript
- Certificate of enrollment
Email address for application
Please send the application documents listed above by email to nicholas.tanjerome@kit.edu