fi3271 plan for 2024-2
FI3271 Data Analysis with Machine Learning is a 3-hour course given over about 15-16 weeks. It is part of the 2024 curriculum.
topics
- Computational thinking and algorithm design
- Concepts in machine learning: supervised and unsupervised learning
- Probability distributions and their applications
- Linear models for regression and classification
- Sampling methods and their role in data analysis
- Artificial neural networks: structure and training
- Gaussian process regression for atomic force field modeling
- Genetic algorithms and evolutionary computation
- Bayesian optimization and its applications
learning outcomes
- Understand and explain the basic concepts of computational thinking
- Design algorithms to solve problems involving physical systems and data
- Apply machine learning techniques to analyze data from physical systems
- Present scientific findings clearly in both written reports and oral presentations
- Collaborate effectively in teams and work independently when needed
conducted plan (fm)
| Week | Topic ↓ Subtopic |
|---|---|
| | Computational thinking and algorithm design |
| 01.1 | Concept of computational thinking |
| | Abstraction and decomposition |
| | Pattern recognition |
| 01.2 | Algorithmic thinking |
| | Sampling methods and their role in data analysis |
| 02.1 | Data mining |
| | Data preprocessing |
| | Linear models for regression and classification |
| 02.2 | Linear regression |
| 04.1 | Classification with Support Vector Machine (SVM) |
| | Quiz |
| 04.2 | Support Vector Machine kernels |
| | Concepts in machine learning: supervised and unsupervised learning |
| 03.1 | Principal Component Analysis 1 |
| 03.2 | Principal Component Analysis 2 |
| 05.1 | Classification with k-nearest neighbors (k-NN) algorithm 1 |
| 05.2 | Classification with k-nearest neighbors (k-NN) algorithm 2 |
| | Artificial neural networks: structure and training |
| 06.1 | Concept of Artificial Neural Network (ANN) |
| 06.2 | Architecture of Multi-Layer Perceptron (MLP) 1 |
| 07.1 | Architecture of Multi-Layer Perceptron (MLP) 2 |
| 07.2 | ANN basic method hands-on (see the sketch after this table) |
| 08.1 | Midterm |
| 08.2 | ANN hierarchy design |
| | Independent assignment |
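Week 07.2 is a hands-on session. A minimal sketch of such a session is given below, assuming scikit-learn and its bundled Iris dataset; the actual library, dataset, and network settings are not specified in this plan.

```python
# Hypothetical hands-on sketch: a small MLP classifier with scikit-learn.
# The dataset (Iris) and all hyperparameters are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Load a small labeled dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Scale the features, then train a multi-layer perceptron with one hidden layer.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
print("test accuracy:", model.score(X_test, y_test))
```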
tentative plan (sv)
| Week | Topic ↓ Subtopic |
|---|---|
| | Probability distributions and their applications |
| 09.1 | Introduction to probability distributions |
| | Common probability distributions in Machine Learning |
| 09.2 | Applications in Machine Learning |
| | Gaussian process regression for atomic force field modeling (see the first sketch after this table) |
| 10.1 | Background concepts |
| | Representing atomic environments |
| | Model training and prediction |
| | Force field construction |
| 10.2 | Applications |
| | Software and tools |
| | Limitations and challenges |
| | Genetic algorithms and evolutionary computation |
| 11.1 | Foundations of Evolutionary Computation |
| | Genetic Algorithms (GAs) |
| | Evolutionary Strategies (ES) |
| | Genetic Programming (GP) |
| | Differential Evolution (DE) |
| | Other Evolutionary Algorithms |
| | Hybrid and Memetic Algorithms |
| 11.2 | Theoretical Aspects |
| | Applications |
| | Tools and libraries |
| | Benchmark problems |
| | Recent advances and research trends |
| | Bayesian optimization and its applications (see the second sketch after this table) |
| 12.1 | Foundations of Bayesian optimization |
| | Advanced techniques |
| | Algorithmic implementations |
| 12.2 | Applications |
| | Applications in Machine Learning |
| | Applications in Science and Engineering |
| | Applications in Business and Operations |
| | Software and tools |
| | Recent research trends |
| | Research-Based Learning (RBL) |
| 13.1 | Topic proposal and discussion |
| | Working group creation |
| 13.2 | Short presentation |
| 14.1 | Progress report presentation 1 |
| 14.2 | Progress report presentation 2 |
| 15.1 | Progress report presentation 3 |
| 15.2 | Progress report presentation 4 |
| 16.1 | Final report presentation |
| 16.2 | Publishing final report on Medium, OSF, YouTube |
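For the Gaussian process regression topic (weeks 10.1-10.2), the first sketch below fits a GP to a noisy 1-D toy function with scikit-learn. It only illustrates model training and prediction with uncertainty; the kernel, data, and library are assumptions, and a real force-field model would use atomic-environment descriptors and dedicated packages that this plan does not name.

```python
# Illustrative Gaussian process regression on a noisy 1-D toy function.
# Kernel choice and data are assumptions, not the course's actual workflow.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Generate noisy samples of a known target function as training data.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 10, size=(25, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.normal(size=25)

# RBF kernel plus a learned noise term; hyperparameters are tuned inside fit()
# by maximizing the log marginal likelihood.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0)
gpr.fit(X_train, y_train)

# Predict the mean and the predictive standard deviation on a dense grid.
X_test = np.linspace(0, 10, 200).reshape(-1, 1)
y_mean, y_std = gpr.predict(X_test, return_std=True)
print("max predictive std:", y_std.max())
```

For the Bayesian optimization topic (weeks 12.1-12.2), the second sketch runs a simple loop with a Gaussian process surrogate and an expected-improvement acquisition on a toy 1-D objective. The objective, the candidate grid, and the iteration budget are illustrative assumptions.

```python
# Illustrative Bayesian optimization loop (GP surrogate + expected improvement).
# The toy objective and all settings are assumptions for demonstration only.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy function to minimize; treated as a black box by the optimizer.
    return np.sin(3 * x) + 0.1 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5, 1))             # initial design points
y = objective(X).ravel()
grid = np.linspace(-3, 3, 400).reshape(-1, 1)   # candidate points

gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                               normalize_y=True, random_state=0)

for _ in range(15):
    # Fit the surrogate to all evaluations so far.
    gpr.fit(X, y)
    mu, sigma = gpr.predict(grid, return_std=True)

    # Expected improvement over the best observed value (minimization).
    best = y.min()
    imp = best - mu
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)

    # Evaluate the objective at the most promising candidate.
    x_next = grid[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best value:", y.min())
```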
notes
- Information from SIX is gathered and combined with the lecturers' discussion.
- There are four references from the course syllabus 1, 2, 3, 4, where the first two are the main references and the others are additional ones.
- There is an additional reference for information related to probability distributions 5.
refs
Christopher M. Bishop, “Pattern Recognition and Machine Learning”, 1st edition, Springer, 2006, url https://isbnsearch.org/isbn/9780387310732 2dn86 [20250420]. ↩︎
Simon Haykin, “Neural Networks and Learning Machines”, 3rd edition, Pearson Education, 2009, url https://isbnsearch.org/isbn/9780131471399 b69e4 [20250420]. ↩︎
Stuart J. Russell, Peter Norvig, “Artificial Intelligence: A Modern Approach”, 3rd edition, Pearson Education, 2016, url https://isbnsearch.org/isbn/9780136042594 s3keu [20250420]. ↩︎
Alberto Artasanchez, Prateek Joshi, “Artificial Intelligence with Python: Your complete guide to building intelligent apps using Python 3.x”, 2nd edition, Packt Publishing, 2020, url https://isbnsearch.org/isbn/9781839219535 eb3p2 [20250420]. ↩︎
T. T. Soong, “Fundamentals of Probability and Statistics for Engineers”, 1st edition, Wiley, 2004, url https://isbnsearch.org/isbn/9780470868140 [20250420]. ↩︎