Errata#

Last regenerated: 2026-04-27 16:50 UTC

This page lists every error a reader (or the author) has found in the course material, together with the correction and the date the fix was deployed. We maintain it because:

  1. Students reading the course in different semesters should be able to see what changed since they last visited.

  2. A public errata page sets the right expectation: even careful courses iterate, and reporting issues is welcome.

  3. The frequency of errata in a chapter is a useful signal about which chapters might need a rewrite.

The errata page itself is built from a structured source file (errata.yml) by the script scripts/render_errata.py, so every entry here is also queryable as machine-readable data.
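
Since the entries are meant to be queryable, here is a minimal sketch of reading the data back, assuming PyYAML and guessing the field names (id, severity, location, credit, commit) from the rendered entries below; the real errata.yml schema may differ:

```python
import yaml  # assumes PyYAML (pip install pyyaml)
from collections import Counter

# A hypothetical entry in the shape this page suggests; field names are
# guessed from the rendered entries, not taken from the real schema.
SAMPLE = """
- id: E003
  severity: typo
  location: applets/knowledge-test.html - Q27
  reported: 2026-04-27
  fixed: 2026-04-27
  credit: internal review   # or "anonymous", or a custom string
  description: >
    "Nakradające" (not a Polish word) corrected to "Nakładające".
  commit: eb69101
"""

entries = yaml.safe_load(SAMPLE)
# Reproduce the kind of tally shown in the Summary section below.
print(Counter(e["severity"] for e in entries))   # Counter({'typo': 1})
```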

How to report#

Email nasqret@gmail.com with:

  • The chapter / applet / page where the problem appears

  • A short description of what is wrong (Polish or English both fine)

  • A suggested correction if you have one (not required)

  • How you would like to be credited — by name, by initials, by a handle, or anonymously

Language policy

Bug reports are welcome in Polish or English. The errata page is in English; entries originally about Polish content quote the Polish text in the description.

Credit policy

Reporters are credited by name in the entry and in the “Contributors” section unless they ask to remain anonymous, ask for initials only, or use a handle. Set the credit field per-entry to “name”, “anonymous”, or a custom string. We do not publish email addresses.

Reward policy

No monetary reward (sorry, Knuth). What you get is name credit, a permanent record of your contribution, and the satisfaction of making the course better for the next cohort.

Summary#

Total entries: 8   ·  Currently open (unfixed): 0   ·  Years covered: 2026 (8)

By severity:

  Severity        Count
  math-error      2
  code-bug        2
  typo            3
  clarification   1

Chapters with at least one entry (higher counts may indicate chapters needing rewrite):

  • Chapter 37: 2

  • Chapter 27: 1

  • Chapter 32: 1

Recent changes (newest first)#

E001 — typo · applets/knowledge-test.html (and moodle_quiz_gift_pl.txt) — Q24

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The Polish word “hiperpłaszczyzna” (hyperplane) was misspelled as “hiperprzaszczyzna” (rz/ł confusion) in the Polish quiz. The lemma form was caught earlier; the declined form (“hiperprzaszczyźnie”) survived a sweep and was found in this round.

Original:

hiperprzaszczyźnie


Correction:

hiperpłaszczyźnie


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E002 — math-error · moodle_quiz_gift_pl.txt + moodle_quiz_gift.txt — Q38 (BP2)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The correct-answer LaTeX in the GIFT export of Q38 (backpropagation equation BP2) was truncated mid-formula at “\sigma”: the missing tail “'(\mathbf{z}^{(l)}))$$” cut the formula short and left the opening “$$” unclosed. One distractor had the same defect. The HTML version of the quiz was correct; the bug only affected the Moodle GIFT upload, where it would have rendered as broken math in the Moodle question bank.
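
For context, the full display-math expression the export should contain, reconstructed here from the surviving \sigma and \mathbf{z}^{(l)} fragments and the standard form of BP2 (the course's exact typography may differ):

```latex
% BP2, assuming the course's (l)-superscript notation, with \odot
% denoting the elementwise (Hadamard) product:
$$
\boldsymbol{\delta}^{(l)} =
  \left( (\mathbf{W}^{(l+1)})^{\top} \boldsymbol{\delta}^{(l+1)} \right)
  \odot \sigma'(\mathbf{z}^{(l)})
$$
```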

Correction:

Restored the full expression in both halves; the $$ pairs now balance.


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E003 — typo · applets/knowledge-test.html — Q27

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

“Nakradające” (not a Polish word) → “Nakładające” (overlapping) in the explanation of why overlapping convex hulls preclude linear separation.

Original:

Nakradające się otoczki wypukłe


Correction:

Nakładające się otoczki wypukłe


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E004 — typo · applets/knowledge-test.html — Q45 distractor B

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

“tysięckrotna” (vowel typo from “tysiąc”/“thousand”) → “tysiąckrotna” in a quiz distractor about gradient decay.

Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E005 — clarification · §37.5 + §40.7

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The accuracy comparison tables in Chapter 37 (Bahdanau attention) and Chapter 40 (Transformer) presented vanilla_baseline and bahdanau_baseline numbers as if measured live in those notebooks, but they were static numbers reproduced from Chapter 36/37 measurements. Comments in the code now explicitly state which row is measured live (the current chapter’s model) and which rows are reproduced from earlier chapters.

Commit: d95408a (https://github.com/nasqret/classical-foundations-ann/commit/d95408a)

E006 — math-error · §37.5 (also §38.6 and §40.7)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The Bahdanau-attention reverser model (Ch 37) was passing the encoder’s final hidden state h to the decoder. Under variable-length training batches with PAD tokens, h was contaminated by the GRU walking through PAD positions; at inference (single sample, no padding) h was clean. This train/inference mismatch made the model fail on short inputs (0% at length 5) while the prose claimed near-100% accuracy. Fixed by zero-initializing the decoder’s hidden state and letting attention provide all the source information. The same fix was applied in Ch 38 (Reverser comparison); Ch 40 (Transformer) was not affected (no recurrent state to leak). Result: 100% per-token accuracy across in-distribution lengths.
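
A minimal sketch of the fix, with hypothetical names and shapes (the chapter’s actual GRU encoder/decoder code may differ):

```python
import torch

def init_decoder_hidden(batch_size: int, hidden_size: int) -> torch.Tensor:
    # Before: dec_hidden = enc_hidden_final -- a state that on padded
    # training batches had walked through PAD positions, but at
    # single-sample inference had not (the train/inference mismatch).
    # After: a zero state that is identical in both regimes; attention
    # over the encoder outputs carries all the source information.
    return torch.zeros(1, batch_size, hidden_size)  # (num_layers, B, H)

dec_hidden = init_decoder_hidden(batch_size=32, hidden_size=128)
print(dec_hidden.shape)  # torch.Size([1, 32, 128])
```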

Commit: d95408a (https://github.com/nasqret/classical-foundations-ann/commit/d95408a)

E007 — code-bug · applets/{bahdanau-attention-explorer,scaled-attention-lab,qkv-explorer,transformer-visualizer}.html

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

MathJax in the four Part XI applets was configured with only inlineMath: [['\\(','\\)']], which overrode MathJax 3’s defaults. All $...$ and $$...$$ markup in the applet prose rendered as literal dollar signs. Fixed by adding both inline-math delimiters ($...$ AND \\(...\\)) and both display-math delimiters ($$...$$ AND \\[...\\]) to the MathJax config.

Commit: 7c51e35 (https://github.com/nasqret/classical-foundations-ann/commit/7c51e35)

E008 — code-bug · §32 (last code cell)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The last code cell raised “RuntimeError: Can't call numpy() on Tensor that requires grad” while extracting hidden-state activations for the diagnostic plot. Fixed by changing .numpy() to .detach().numpy() on the four relevant lines. The error was previously visible in the deployed page, since model.fc(...) produces a tensor with grad tracking enabled and PyTorch refuses the implicit conversion.
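
A minimal reproduction and fix under assumed shapes (the chapter’s real model differs):

```python
import torch
import torch.nn as nn

fc = nn.Linear(8, 4)            # stands in for model.fc
hidden = torch.randn(3, 8)      # stands in for the hidden-state batch
out = fc(hidden)                # requires grad: fc has trainable params

# out.numpy()                   # raises the RuntimeError quoted above
acts = out.detach().numpy()     # the fix: detach from the graph first
print(acts.shape)               # (3, 4)
```

Wrapping the extraction in a torch.no_grad() block would avoid the error equally well.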

Commit: 49d3834 (https://github.com/nasqret/classical-foundations-ann/commit/49d3834)

By chapter#

Chapter 27#

E003 — typo · applets/knowledge-test.html — Q27

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

“Nakradające” (not a Polish word) → “Nakładające” (overlapping) in the explanation of why overlapping convex hulls preclude linear separation.

Original:

Nakradające się otoczki wypukłe


Correction:

Nakładające się otoczki wypukłe


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

Chapter 32#

E008 — code-bug · §32 (last code cell)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The last code cell raised “RuntimeError: Can't call numpy() on Tensor that requires grad” while extracting hidden-state activations for the diagnostic plot. Fixed by changing .numpy() to .detach().numpy() on the four relevant lines. The error was previously visible in the deployed page, since model.fc(...) produces a tensor with grad tracking enabled and PyTorch refuses the implicit conversion.

Commit: 49d3834 (https://github.com/nasqret/classical-foundations-ann/commit/49d3834)

Chapter 37#

E005 — clarification · §37.5 + §40.7

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The accuracy comparison tables in Chapter 37 (Bahdanau attention) and Chapter 40 (Transformer) presented vanilla_baseline and bahdanau_baseline numbers as if measured live in those notebooks, but they were static numbers reproduced from Chapter 36/37 measurements. Comments in the code now explicitly state which row is measured live (the current chapter’s model) and which rows are reproduced from earlier chapters.

Commit: d95408a (https://github.com/nasqret/classical-foundations-ann/commit/d95408a)

E006 — math-error · §37.5 (also §38.6 and §40.7)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The Bahdanau-attention reverser model (Ch 37) was passing the encoder’s final hidden state h to the decoder. Under variable-length training batches with PAD tokens, h was contaminated by the GRU walking through PAD positions; at inference (single sample, no padding) h was clean. This train/inference mismatch made the model fail on short inputs (0% at length 5) while the prose claimed near-100% accuracy. Fixed by zero-initializing the decoder’s hidden state and letting attention provide all the source information. The same fix was applied in Ch 38 (Reverser comparison); Ch 40 (Transformer) was not affected (no recurrent state to leak). Result: 100% per-token accuracy across in-distribution lengths.

Commit: d95408a (https://github.com/nasqret/classical-foundations-ann/commit/d95408a)

Non-chapter content (applets, slides, infrastructure)#

E001 — typo · applets/knowledge-test.html (and moodle_quiz_gift_pl.txt) — Q24

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The Polish word “hiperpłaszczyzna” (hyperplane) was misspelled as “hiperprzaszczyzna” (rz/ł confusion) in the Polish quiz. The lemma form was caught earlier; the declined form (“hiperprzaszczyźnie”) survived a sweep and was found in this round.

Original:

hiperprzaszczyźnie


Correction:

hiperpłaszczyźnie


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E002 — math-error · moodle_quiz_gift_pl.txt + moodle_quiz_gift.txt — Q38 (BP2)

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

The correct-answer LaTeX in the GIFT export of Q38 (backpropagation equation BP2) was truncated mid-formula at “\sigma”: the missing tail “'(\mathbf{z}^{(l)}))$$” cut the formula short and left the opening “$$” unclosed. One distractor had the same defect. The HTML version of the quiz was correct; the bug only affected the Moodle GIFT upload, where it would have rendered as broken math in the Moodle question bank.

Correction:

Restored the full expression in both halves; the $$ pairs now balance.


Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E004 — typo · applets/knowledge-test.html — Q45 distractor B

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

“tysięckrotna” (vowel typo from “tysiąc”/“thousand”) → “tysiąckrotna” in a quiz distractor about gradient decay.

Commit: eb69101 (https://github.com/nasqret/classical-foundations-ann/commit/eb69101)

E007 — code-bug · applets/{bahdanau-attention-explorer,scaled-attention-lab,qkv-explorer,transformer-visualizer}.html

Reported: 2026-04-27 · Fixed: 2026-04-27 · Credit: internal review

MathJax in the four Part XI applets was configured with only inlineMath: [['\\(','\\)']], which overrode MathJax 3’s defaults. All $...$ and $$...$$ markup in the applet prose rendered as literal dollar signs. Fixed by adding both inline-math delimiters ($...$ AND \\(...\\)) and both display-math delimiters ($$...$$ AND \\[...\\]) to the MathJax config.

Commit: 7c51e35 (https://github.com/nasqret/classical-foundations-ann/commit/7c51e35)

Contributors#

The first reader-reported entry will create this section.

If you spot a problem and email it in, your name (or your preferred attribution) will be recorded here. We do not publish email addresses, only the name or handle you specify.