Gauge Theory by Tim 04 Differential Forms

What is the definition of a differential form? A k-form is an object that can be integrated over a k-dimensional domain or manifold. Given what was discussed earlier about the pullback of functions and the pushforward, it is the same concept extended to higher dimensions: the goal is to pull back differential forms by passing through wedge products. The contraction operator? It is written as below …
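One concrete consequence of pulling back forms is that a 2-form picks up the Jacobian determinant of the map. Below is a minimal numerical sketch (not from the lecture) of pulling back the area form dx ∧ dy along the polar-coordinate map; the names `jacobian` and `polar` are illustrative:

```python
import numpy as np

# Hedged sketch: pulling back the area 2-form dx ∧ dy along the polar map
# phi(r, theta) = (r cos θ, r sin θ). The pullback multiplies the form by
# the Jacobian determinant, giving r dr ∧ dθ.

def jacobian(phi, p, h=1e-6):
    """Numerical Jacobian of phi: R^2 -> R^2 at point p (central differences)."""
    p = np.asarray(p, dtype=float)
    cols = []
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h
        cols.append((np.asarray(phi(p + dp)) - np.asarray(phi(p - dp))) / (2 * h))
    return np.column_stack(cols)

def polar(p):
    r, theta = p
    return (r * np.cos(theta), r * np.sin(theta))

# Coefficient of the pulled-back 2-form phi*(dx ∧ dy) at (r, θ) is det J.
r, theta = 2.0, 0.3
coeff = np.linalg.det(jacobian(polar, (r, theta)))
print(coeff)  # ≈ r = 2.0
```

The determinant reproduces the familiar change-of-variables factor r from multivariable calculus, which is exactly the 2-form statement φ*(dx ∧ dy) = r dr ∧ dθ.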

Gauge Theory by Tim 03 Pull Back and Push Forward

In this session, he develops differential geometry starting from coordinate charts: having a clear picture of the punctured plane is very helpful later for coordinate transformations. He then turns to the concepts of pullback and pushforward. Pullback here means the pullback of functions, i.e. a change of variables. To understand …
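The pullback of a function is just composition: if φ maps M to N and f is a function on N, then φ*f = f ∘ φ is a function on M. A minimal sketch (function names are illustrative, not from the lecture), using polar coordinates as the change of variables:

```python
import math

# Hedged sketch: the pullback of a function along a map.
# If phi: M -> N and f: N -> R, then the pullback phi*f = f ∘ phi
# is a function on M.

def pullback(phi, f):
    """Return phi*f, i.e. f composed with phi."""
    return lambda p: f(phi(p))

# Change of variables: phi(r, theta) = (x, y)
def polar_to_cartesian(p):
    r, theta = p
    return (r * math.cos(theta), r * math.sin(theta))

def radius_squared(q):  # a function on the (x, y) plane
    x, y = q
    return x * x + y * y

g = pullback(polar_to_cartesian, radius_squared)
print(g((2.0, 0.7)))  # ≈ 4.0 (= r^2, independent of theta)
```

Note that the pullback goes in the opposite direction to the map itself: φ pushes points forward from M to N, while functions on N get pulled back to M.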

Gauge Theory by Timothy Nguyen 01

Local symmetries and parallel transport: every location has a fiber, so how do we relate the fibers? That is what parallel transport does: it relates fibers. Note that the word "parallel" generalizes "constant"! In calculus, the derivative ẏ = dy(t)/dt = 0 means "constant"; in gauge theory, a parallel section (or parallel function) is one whose covariant derivative is zero, or …
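The condition "covariant derivative = 0" is an ordinary differential equation along a curve, which can be integrated numerically. Below is a hedged sketch assuming a U(1)-type connection with a real-valued connection coefficient A(t); all names are illustrative:

```python
import math

# Hedged sketch: a section s(t) along a curve is "parallel" when its
# covariant derivative vanishes: ds/dt + A(t) s = 0.
# This generalizes "constant": A = 0 recovers ds/dt = 0.

def parallel_transport(A, s0, t0, t1, steps=100_000):
    """Integrate ds/dt = -A(t) * s with forward Euler."""
    s, t = s0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        s += dt * (-A(t) * s)
        t += dt
    return s

# With A(t) = 1, the parallel section is s(t) = s0 * exp(-t).
s = parallel_transport(lambda t: 1.0, s0=1.0, t0=0.0, t1=1.0)
print(s)  # ≈ exp(-1) ≈ 0.3679
```

The point of the sketch: the transported section is generally not constant in the naive sense, yet it is "parallel" with respect to the connection, which is exactly how the covariant derivative generalizes the ordinary derivative.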

Maximum Likelihood Estimation (MLE)

Maximum Likelihood Estimation (MLE) is a statistical method for estimating the parameters of a probability distribution by maximizing the likelihood function. The likelihood function matters here because it also serves as the loss function in neural networks. Loss functions in neural networks can …
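For a Gaussian, maximizing the log-likelihood has a closed-form solution, which makes a compact illustration. A minimal sketch (the data and variable names are illustrative, not from the post):

```python
import numpy as np

# Hedged sketch: MLE for a Gaussian. Maximizing the log-likelihood
# sum_i log N(x_i; mu, sigma^2) over (mu, sigma^2) gives closed forms:
# mu_hat = sample mean, sigma2_hat = biased sample variance.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)  # true mu=2, sigma^2=2.25

mu_hat = x.mean()                        # argmax over mu
sigma2_hat = ((x - mu_hat) ** 2).mean()  # note: divides by n, not n - 1

print(mu_hat, sigma2_hat)
```

The n (rather than n − 1) divisor is a genuine feature of MLE: the maximum-likelihood variance estimator is biased, which is one standard example of the MLE/unbiasedness trade-off.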

Kolmogorov-Arnold Networks (KANs)

Kolmogorov-Arnold Networks (KANs) are a novel class of neural networks inspired by the Kolmogorov-Arnold Representation Theorem, which states that any continuous multivariate function can be represented as a superposition of continuous univariate functions and addition. KANs differ from traditional neural networks by replacing weighted-sum-based neurons with learnable, non-linear univariate functions. This allows them to have adaptive basis functions, …
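The structural idea — a learnable univariate function on each edge, with neurons that only sum — can be sketched in a few lines. This is a toy forward pass only (the KAN paper uses splines; here each edge is a small polynomial for brevity), and all names are illustrative:

```python
import numpy as np

# Hedged sketch of a single KAN-style layer. Each edge (j -> i) carries
# its own learnable univariate function phi_ij, parameterized here as a
# low-degree polynomial; a neuron simply sums its incoming edge functions
# (no weight matrix, no fixed activation).

class KANLayer:
    def __init__(self, n_in, n_out, degree=3, rng=None):
        rng = rng or np.random.default_rng(0)
        # coeffs[i, j, d] = coefficient of x^d for edge j -> i
        self.coeffs = rng.normal(scale=0.1, size=(n_out, n_in, degree + 1))

    def __call__(self, x):
        # x: (batch, n_in); powers: (batch, n_in, degree + 1)
        powers = x[..., None] ** np.arange(self.coeffs.shape[-1])
        # evaluate every edge's univariate polynomial, then sum over inputs j
        return np.einsum("bjd,ijd->bi", powers, self.coeffs)

layer = KANLayer(n_in=2, n_out=3)
out = layer(np.array([[0.5, -1.0]]))
print(out.shape)  # (1, 3)
```

Contrast with a standard dense layer, where the learnable parameters are scalar weights inside a fixed-activation sum; here the learnable objects are the univariate functions themselves.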

AdaBoost and Random Forest: Powerful Ensemble Methods

In the realm of data science, ensemble methods play a crucial role in improving predictive performance by combining multiple weak learners into a stronger model. One of the most well-known ensemble techniques is AdaBoost (Adaptive Boosting), introduced by Freund and Schapire in 1996. AdaBoost is a powerful yet intuitive algorithm that enhances the accuracy of …
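The boosting loop itself is short enough to write out. Below is a hedged from-scratch sketch of AdaBoost with decision stumps on a 1-D toy problem (function and variable names are illustrative, not from the post):

```python
import numpy as np

# Hedged sketch of AdaBoost: reweight samples after each round so the
# next weak learner (a decision stump) focuses on previous mistakes.

def fit_stump(x, y, w):
    """Find the (threshold, polarity) stump minimizing weighted error."""
    best = (np.inf, 0.0, 1)
    for thr in np.unique(x):
        for pol in (1, -1):
            pred = np.where(x < thr, pol, -pol)
            err = w[pred != y].sum()
            if err < best[0]:
                best = (err, thr, pol)
    return best  # (weighted error, threshold, polarity)

def adaboost(x, y, n_rounds=3):
    n = len(x)
    w = np.full(n, 1.0 / n)  # start with uniform sample weights
    ensemble = []
    for _ in range(n_rounds):
        err, thr, pol = fit_stump(x, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weak learner's vote weight
        pred = np.where(x < thr, pol, -pol)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, thr, pol))
    return ensemble

def predict(ensemble, x):
    score = sum(a * np.where(x < t, p, -p) for a, t, p in ensemble)
    return np.sign(score)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1, 1, -1, -1, 1, 1])  # no single stump separates this
model = adaboost(x, y, n_rounds=3)
print((predict(model, x) == y).mean())  # -> 1.0
```

No single stump can classify this interval-shaped labeling, but three weighted stumps together do, which is the essence of combining weak learners into a stronger model.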