backprop-2.md
Picking up where I left off: deriving Deep Learning without touching a single index
Eight months after the last post on deriving backpropagation, I finally got some time to write the sequel. A lot has happened in those eight months, but what matters is that the post is finally here.
In the last post, I defined derivatives on normed vector spaces and used that to find the derivative of one of the most important functions: the Frobenius norm. The heart of our approach is using linearity to pass the derivative operator through linear functions like the trace, ending up with a nice and clean expression for the derivative of complicated formulas.
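As a quick refresher, here is that derivation in miniature (the standard trace trick, rewritten from memory rather than quoted from the last post, and applied to the squared norm to keep it clean):

\[
\|X\|_F^2 = \operatorname{tr}(X^\top X)
\quad\Longrightarrow\quad
d\,\|X\|_F^2 = \operatorname{tr}(dX^\top X) + \operatorname{tr}(X^\top dX) = 2\operatorname{tr}(X^\top dX),
\]

where the second step passes \( d \) through the linear \( \operatorname{tr} \), and the last step uses \( \operatorname{tr}(A) = \operatorname{tr}(A^\top) \). So the derivative of \( \|X\|_F^2 \) with respect to \( X \) is simply \( 2X \), and not a single index was harmed.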
combine.md
Solving a four-year-old problem: finding all combine functions for NRS
It was 2020. I was starting to switch from the “Cry = 10/10” rating system to a more systematic approach, as described in the post Anime Rating Rework. With it came a brand new system for anime ranking, with its own complicated rules. An anime’s score consists of several subscores, and the total score is simply the weighted sum of those subscores. No more “if this anime makes me cry, then it is an instant 10” (looking at you, Your Lie in April). The motivation for this was to buff the then-ongoing series Love Live! Nijigasaki Gakuen School Idol Doukoukai, an anime that completely changed my life (for better or worse). In the old system, it did not give me much “emotional impact”, so it did not perform as well as the top anime. And because of that, under the excuse of “modernizing the anime ranking system”, a rework was made out of nowhere, giving birth to the first rework system.
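In symbols, the general shape is just (a sketch only; the actual subscores and weights are what the rest of the post is about):

\[
S_{\text{total}} = \sum_i w_i\, s_i,
\]

where the \( s_i \) are the subscores and the \( w_i \) their weights.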
cxx-modules.md
As everyone who clicked on this post should know by now, C++ modules have been a thing since C++20. Oh, 2020, it’s been a while, huh. Back then, C++ modules were nothing but a distant dream to me. We would need to wait for major compilers to adopt this new thing, for CMake to support it, and for LSPs like clangd to handle it all properly. And looking at how some people were still writing C++ like ‘C with classes’, surely this wouldn’t come any time soon, right?
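For reference, here is roughly what that distant dream looks like: a minimal C++20 module sketch (the file names, the module name `hello`, and the `greet` function are all made up for illustration):

```cpp
// hello.cppm -- a module interface unit ('.cppm' is a common convention,
// not something the standard mandates)
module;               // global module fragment: legacy #includes go here
#include <iostream>

export module hello;  // this file defines the module named 'hello'

export void greet() { // 'export' makes greet visible to importers
    std::cout << "Hello from a C++20 module!\n";
}
```

And a consumer, which imports the module instead of #including a header:

```cpp
// main.cpp
import hello;

int main() {
    greet();
}
```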
backprop-1.md
My earlier attempts at Machine Learning
When I started programming, I saw all sorts of videos about training neural networks to play games. Y’know, those “AI learns to play [name of game]” videos that use some black-box optimization technique like a Genetic Algorithm (GA) to optimize the network so that it gradually gets better at playing that game. The most iconic one was probably SethBling’s MarI/O - Machine Learning for Video Games (which, not gonna lie, was much simpler than what I recalled). This technique originates from an old paper, Evolving Neural Networks through Augmenting Topologies, which I only just found after checking out the SethBling video’s description. The main idea is to evolve not just the network’s parameters but also its topology, using the standard GA procedure of mutation and selection to gradually grow better-performing networks.
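To make the idea concrete, here is a toy sketch of the fixed-topology flavour of this approach (real NEAT also mutates the topology; the population size, mutation scale, and stand-in fitness function below are all made up for illustration):

```cpp
// Toy neuroevolution loop: evolve a flat weight vector with a genetic
// algorithm. Real NEAT also grows the network topology; this keeps it fixed.
#include <algorithm>
#include <random>
#include <vector>

using Genome = std::vector<double>; // flattened network weights

// Stand-in fitness: closeness of the weights to an arbitrary target.
// In the videos above, this would be the game score the network achieves.
double evaluate(const Genome& g) {
    double score = 0.0;
    for (double w : g) score -= (w - 1.0) * (w - 1.0);
    return score;
}

int main() {
    std::mt19937 rng{42};
    std::normal_distribution<double> noise{0.0, 0.1};

    const int pop_size = 50, n_weights = 20, generations = 100;
    std::vector<Genome> pop(pop_size, Genome(n_weights, 0.0));

    for (int gen = 0; gen < generations; ++gen) {
        // Rank the population, best genome first.
        std::sort(pop.begin(), pop.end(),
                  [](const Genome& a, const Genome& b) {
                      return evaluate(a) > evaluate(b);
                  });
        // Selection + mutation: the bottom half becomes noisy copies
        // of the top half.
        for (int i = pop_size / 2; i < pop_size; ++i) {
            pop[i] = pop[i - pop_size / 2];
            for (double& w : pop[i]) w += noise(rng);
        }
    }
}
```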
first-post.md
So I guess I’ll post stuff that I’m interested in here.
Hi, I guess.
Also I probably won’t post much because I seldom do anything that interesting. (I wish I could work on things that fascinate me, but idk, the economy is kind of in a bad state right now.)
Here is some test LaTeX (inline and block): \( L_o(\mathbf{x}, \omega_o, \lambda, t) = L_e(\mathbf{x}, \omega_o, \lambda, t) + L_r(\mathbf{x}, \omega_o, \lambda, t) \),

\[
L_o(\mathbf{x}, \omega_o, \lambda, t) = L_e(\mathbf{x}, \omega_o, \lambda, t) + L_r(\mathbf{x}, \omega_o, \lambda, t).
\]