https://www.fabiz.ase.ro/bip-2025/
Course Material: https://quantinar.com/coursecontent/908/natural-language-processing
This course traced the 75-year journey from Claude Shannon's 1950 insight that language has predictable statistical structure to today's large language models. It began with simple word counting (n-grams), which powered early predictive text and achieved 60% accuracy in machine translation, and exposed the fundamental limitation of that approach: treating words as isolated symbols prevents generalization. The 2003 breakthrough of representing words as vectors that capture semantic meaning enabled neural language models. Through hands-on exploration of Shakespeare's texts, students saw how transformers revolutionized NLP by solving the context problem with attention mechanisms, which let words dynamically exchange information based on relevance. The course culminated in building a minimal transformer from scratch and visualizing how tokens travel through embeddings, multi-head attention, and feed-forward layers to generate predictions. By reducing embeddings to 3 dimensions for geometric intuition and tracking tensor shapes through every transformation (input → function → output), students gained a deep understanding of how the simple task of predicting "what comes next" evolved into systems like GPT that display emergent intelligence. The key insight: when prediction becomes accurate enough across enough text, it becomes indistinguishable from understanding.
Event: FABIZ Business and Technology Blended Intensive Program (BIP) 2025
Theme: Data without Borders - Building AI competence across borders
Location: Mamaia, Constanta, Romania
Date: August 24-31, 2025
Organizing Team
- Assoc. Prof. Dr. Sorin Anagnoste - Vice-Dean for Academic Affairs, FABIZ
Program Coordination
Organizing Institution
- Faculty of Business Administration in Foreign Languages (FABIZ)
Bucharest University of Economic Studies, Romania
Lecturers
- Prof. Dr. Wolfgang Härdle - Humboldt University of Berlin
Digital Economy & Decision Analytics
- Prof. Dr. Jörg Osterrieder - University of Twente
Machine Learning for Finance with Applications / Natural Language Processing
- Dr. Ruting Wang - City University of Hong Kong
Machine Learning in Financial Risk
- Prof. Sut I Wong - BI Norwegian Business School
Cultivating a Digital Mindset for Tech Leadership
- Prof. Dimitris Karlis - Athens University of Economics and Business
Sports Analytics
- Dr. Adriana Davidescu, Diana Agafitei, Bianca Bolboasa - Bucharest University of Economic Studies
From Data to Insight: Transforming Databases into Applied Economic Research
Industry Speakers
- Iulian Serban & Ana-Maria Preda - Maspex
Transforming Data into Meaning: Demand Planning
- Loredana Vasile - Procter & Gamble
Why is Data Science Your Best Friend?
- Marius Antonie - EY
IA & AI in Business
Overview
This comprehensive lecture series explored how the simple task of predicting the next word evolved into modern artificial intelligence. The course was part of an international summer school bringing together students from:
- Humboldt University of Berlin
- University of Twente
- City University of Hong Kong
- BI Norwegian Business School
- Athens University of Economics and Business
- Bucharest University of Economic Studies
The Foundations
- Why next-word prediction matters for AI
- First attempts using word counting (n-grams)
- Understanding fundamental limitations
- Hands-on with Shakespeare texts
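The first attempt above can be sketched in a few lines: count which word follows which, then predict the most frequent successor. A minimal bigram example (the toy corpus below stands in for the Shakespeare text used in class):

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the Shakespeare dataset used in the notebooks.
corpus = "to be or not to be that is the question".split()

# Count bigrams: for each word, how often each successor follows it.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent successor of `word`, or None if unseen."""
    if word not in bigrams:
        return None
    return bigrams[word].most_common(1)[0][0]

print(predict_next("to"))        # "be" follows "to" in both occurrences
print(predict_next("question"))  # never seen as a left context -> None
```

The second call also illustrates the fundamental limitation: a word the model never saw in that position yields nothing, because symbols carry no meaning that would let the model generalize.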
Smarter Prediction
- Word embeddings - representing meaning as vectors
- Context-dependent understanding (RNNs)
- Attention mechanisms - all context matters
- The transformer breakthrough
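The attention idea in this block can be sketched as scaled dot-product attention over tiny embeddings. This is a simplified illustration, not the full mechanism: the values are random stand-ins, and the learned projection matrices W_q, W_k, W_v of a real transformer are omitted:

```python
import numpy as np

# Tiny 3-dimensional embeddings for a 4-token sequence (illustrative values).
np.random.seed(0)
x = np.random.randn(4, 3)  # shape: (seq_len, d_model)

# For this sketch, queries, keys, and values are the embeddings themselves.
q, k, v = x, x, x

# Each token scores every other token for relevance, softmax turns the
# scores into weights, and the output is a weighted mix of all values.
scores = q @ k.T / np.sqrt(k.shape[-1])          # (4, 4) relevance scores
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
out = weights @ v                                # (4, 3) context-mixed vectors

assert np.allclose(weights.sum(axis=-1), 1.0)    # each row is a distribution
print(out.shape)  # (4, 3): same shape as the input, but now context-aware
```

This is the sense in which "all context matters": every output vector is a mixture over the whole sequence, with the mixing weights computed dynamically from relevance.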
Prediction at Scale
- Learning from human text at scale
- When prediction becomes intelligence
- GPT and modern language models
- Real-world applications and implications
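One small piece of how prediction turns into generation at scale can be sketched with temperature sampling: a trained model outputs scores (logits) over candidate next tokens, and a temperature parameter controls how greedily we sample from them. The logits below are hypothetical stand-ins for a real model's output:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(logits, temperature=1.0):
    """Sample a token index from logits; lower temperature -> greedier."""
    z = logits / temperature
    p = np.exp(z - z.max())  # softmax, shifted for numerical stability
    p /= p.sum()
    return rng.choice(len(logits), p=p)

# Hypothetical scores for 3 candidate next tokens.
logits = np.array([2.0, 1.0, 0.1])
picks = [sample(logits, temperature=0.5) for _ in range(1000)]

# At low temperature, most samples should be the highest-scoring token (0).
print(picks.count(0), picks.count(1), picks.count(2))
```

Repeating this step token by token, each sample appended to the context before predicting again, is the whole generation loop of a GPT-style model.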
Educational Innovation
Interactive Learning Materials
Jupyter Notebooks:
1. Simple n-grams and text generation
2. Word embeddings and vector spaces
3. Simple neural networks for NLP
4. Comparing NLP methods
5. A token's journey through transformers
6. Transformers in 3D - visual journey
7. Simplified transformer implementation
8. How transformers learn - training process
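The shape-tracking exercise from notebooks 5-7 can be condensed into one forward pass through a single simplified transformer block, using the course's 3-dimensional embeddings. This sketch uses random weights and omits layer normalization and learned attention projections:

```python
import numpy as np

rng = np.random.default_rng(42)
seq_len, d_model, d_ff = 5, 3, 12  # 3-dim embeddings for geometric intuition

# input: token embeddings
x = rng.standard_normal((seq_len, d_model))

# self-attention (projections omitted): mix tokens by relevance, add residual
scores = x @ x.T / np.sqrt(d_model)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
attended = weights @ x + x

# position-wise feed-forward: expand to d_ff, ReLU, project back, add residual
w1 = rng.standard_normal((d_model, d_ff))
w2 = rng.standard_normal((d_ff, d_model))
out = np.maximum(attended @ w1, 0) @ w2 + attended

# track shapes through every transformation: input -> function -> output
for name, t in [("input", x), ("attention", attended), ("output", out)]:
    print(name, t.shape)
```

Every stage preserves the (seq_len, d_model) shape, which is what makes stacking many such blocks, the core of a real transformer, possible.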
Key Features
- Intuition over Math: Complex concepts made accessible through visualization
- Hands-on Coding: Every concept demonstrated with working Python code
- Progressive Complexity: From counting words to understanding transformers
- Real Data: Using Shakespeare's works as primary dataset
Learning Outcomes
Students gained practical understanding of:
- How language models evolved from simple statistics to neural networks
- The transformer architecture that powers ChatGPT and modern AI
- Word embeddings and semantic representation
- Attention mechanisms and context understanding
- The scaling laws that led to large language models
- Practical implementation skills in Python
International Collaboration
The summer school featured:
- Daily lectures from international speakers
- Business workshops with corporate partners (Maspex, P&G, EY)
- Cultural exchange and networking opportunities
- Certificate with ECTS credits
The course successfully bridged theoretical understanding with hands-on implementation, making advanced NLP concepts accessible to Bachelor-level students through innovative visualization and interactive learning.
All teaching materials demonstrate how the simple task of predicting the next word evolved into the artificial intelligence systems transforming our world today.