A Simple Perceptron to Understand Neural Networks

Let's consider our familiar workflow of portfolio weighting and capping. Have you ever thought about the essence of weight capping? When we cap a position at 10% and redistribute the excess weight proportionally to the rest of the portfolio, what are we really doing? In essence, we're optimizing the portfolio weights by minimizing a weight …
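The cap-and-redistribute step described above can be sketched directly. This is a minimal illustration assuming long-only weights that sum to one; the function name cap_weights, the tolerance, and the iteration logic are my own, not taken from the post:

    import numpy as np

    def cap_weights(weights, cap=0.10, tol=1e-12):
        # Cap each position at `cap` and hand the excess weight back
        # proportionally to the uncapped positions, repeating until no
        # position exceeds the cap.
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                      # normalize to sum to 1
        capped = np.zeros(len(w), dtype=bool)
        while True:
            over = w > cap + tol
            if not over.any():
                return w
            excess = (w[over] - cap).sum()   # total weight above the cap
            w[over] = cap
            capped |= over
            free = ~capped
            if not free.any():
                return w                     # every position is capped
            # redistribute proportionally to the remaining uncapped weights
            w[free] += excess * w[free] / w[free].sum()

The loop matters because redistributing the excess can push a previously compliant position over the cap, so the clip-and-redistribute step repeats until every position complies.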

Gauge Theory by Tim 04 Differential Forms

What is the definition of a differential form? k-forms are the objects that can be integrated over k-dimensional domains or manifolds. Given what was discussed about the pull back of functions and the push forward, it's the same concept extended to higher dimensions: the point is that differential forms are pulled back by passing through wedges. The contraction operator? It's written as below …
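In standard notation (my summary of the textbook definitions, not a transcript of the lecture): for a smooth map $f: M \to N$, the pull back of a k-form acts on tangent vectors through the differential $df$, it distributes over wedge products, and the contraction (interior product) with a vector field $X$ plugs $X$ into the first slot:

    \[
    (f^*\omega)_p(v_1,\dots,v_k) = \omega_{f(p)}\bigl(df_p v_1,\dots,df_p v_k\bigr),
    \qquad
    f^*(\alpha \wedge \beta) = f^*\alpha \wedge f^*\beta,
    \]
    \[
    (\iota_X \omega)(v_1,\dots,v_{k-1}) = \omega(X, v_1,\dots,v_{k-1}).
    \]

The middle identity is what "pulling back by passing through wedges" amounts to: once the pull back is known on functions and 1-forms, the wedge rule determines it on all higher forms.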

Gauge Theory by Tim 03 Pull Back and Push Forward

In this session, he talks about differential geometry starting from coordinate charts: having this clear picture, built from a punctured plane, is very helpful for the later change-of-coordinates and related transformations, and then for understanding the concepts of pull back and push forward. Pull back means the pull back of functions, i.e. a change of variables. To understand …
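The two operations, in standard notation (my summary, with f, g, v as generic names): a smooth map $f: M \to N$ pulls functions back and pushes tangent vectors forward,

    \[
    f^*g = g \circ f \quad \text{for } g: N \to \mathbb{R},
    \qquad
    (f_* v)(g) = v(g \circ f) \quad \text{for } v \in T_pM.
    \]

When $f$ is a change of coordinates, $f^*g$ is literally the same function rewritten in the new variables, which is why the pull back of functions and a change of variables are the same idea.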

Gauge Theory by Timothy Nguyen 01

Local symmetries. Parallel transport: every location has a fiber; how do we relate the fibers? That is what parallel transport does - it relates fibers. Note that the word "parallel" generalizes "constant": in calculus, we know that the derivative $\dot{y} = dy(t)/dt = 0$ means "constant"; in gauge theory, a parallel section (or parallel function) is one whose covariant derivative is zero, or …
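The analogy in the excerpt, written out side by side (the local form $\nabla_\mu = \partial_\mu + A_\mu$ with connection form $A_\mu$ is the standard expression, assumed here rather than quoted from the lecture):

    \[
    \text{calculus:}\ \ \frac{dy}{dt} = 0 \;\Longrightarrow\; y \text{ is constant};
    \qquad
    \text{gauge theory:}\ \ \nabla_\mu s = \partial_\mu s + A_\mu s = 0 \;\Longrightarrow\; s \text{ is parallel}.
    \]

Setting $A_\mu = 0$ recovers the calculus statement, which is the sense in which "parallel" generalizes "constant".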