Posts by Tags

MDPs

Notes on Interval Matrix Theory

less than 1 minute read

Published:

Interval matrix theory is a subfield of linear algebra concerned with obtaining solutions to systems of equations $Ax = b$, where the entries of $A$ and $b$ take values in intervals.
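As a quick illustrative sketch (the system and intervals below are made up, not taken from any reference), one can trace out the spread of solutions of a small interval system by solving it at every combination of interval endpoints; for small systems whose endpoint ("vertex") matrices are all nonsingular, this vertex enumeration captures the extremes of the solution set:

```python
import itertools

# Hypothetical 2x2 interval system: every entry of A and b ranges over an
# interval (lo, hi).  Values chosen so each vertex matrix is nonsingular.
A_int = [[(2.0, 3.0), (0.0, 1.0)],
         [(0.0, 1.0), (2.0, 3.0)]]
b_int = [(1.0, 2.0), (1.0, 2.0)]

def solve_2x2(A, b):
    """Cramer's rule for a 2x2 linear system."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return ((b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det)

# Enumerate every combination of interval endpoints and track the extremes
# of each solution component.
entries = [A_int[0][0], A_int[0][1], A_int[1][0], A_int[1][1],
           b_int[0], b_int[1]]
lo, hi = [float("inf")] * 2, [float("-inf")] * 2
for a00, a01, a10, a11, b0, b1 in itertools.product(*entries):
    x = solve_2x2([[a00, a01], [a10, a11]], [b0, b1])
    for i in range(2):
        lo[i] = min(lo[i], x[i])
        hi[i] = max(hi[i], x[i])

print(lo, hi)
```

Rigorous interval methods compute guaranteed enclosures with interval arithmetic rather than brute-force enumeration, but the sketch conveys what "a solution" means when the data live in intervals.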

Notes on Modeling POMDPs: Crites Elevator

1 minute read

Published:

In his PhD thesis, Crites (see Crites & Barto, 1998) analyzed the problem of controlling multiple elevators, modeling the system as a multi-agent POMDP and noting both the extreme size of the state space — with 4 elevators in a 10-story building, on the order of $10^{22}$ states — and the effectiveness of deep $Q$-learning techniques in attacking the problem.
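As a rough sanity check on that order of magnitude, here is one plausible accounting; the exact discretisation is an assumption for illustration, not necessarily Crites's own:

```python
# Illustrative state count for 4 elevators in a 10-story building:
# 18 hall-call buttons (up on floors 1-9, down on floors 2-10),
# 40 car buttons (10 per elevator), and each of the 4 cars in one of
# ~18 discretised position/direction configurations.
hall_calls = 2 ** 18
car_buttons = 2 ** 40
car_configs = 18 ** 4
states = hall_calls * car_buttons * car_configs
print(f"{states:.2e}")
```

Under these assumptions the count lands at roughly $3 \times 10^{22}$, consistent with the figure quoted above.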

Monte Carlo

POMDPs

Notes on Interval Matrix Theory

less than 1 minute read

Published:

Interval matrix theory is a subfield of linear algebra concerned with obtaining solutions to systems of equations $Ax = b$, where the entries of $A$ and $b$ take values in intervals.

Notes on Modeling POMDPs: Crites Elevator

1 minute read

Published:

In his PhD thesis, Crites (see Crites & Barto, 1998) analyzed the problem of controlling multiple elevators, modeling the system as a multi-agent POMDP and noting both the extreme size of the state space — with 4 elevators in a 10-story building, on the order of $10^{22}$ states — and the effectiveness of deep $Q$-learning techniques in attacking the problem.

algebraic statistics

IMSI Algebraic Statistics Note #1

11 minute read

Published:

Spent the week at a workshop at the University of Chicago, “Invitation to Algebraic Statistics,” part of the IMSI long program “Algebraic Statistics and our Changing World”. This week’s workshop featured excellent talks on the application of algebraic statistics to estimation & optimization problems, graphical models, neural networks, algebraic economics, and ecological problems.

baseball

Modeling Players’ Chances of Entering the BBHOF

18 minute read

Published:

This report describes an attempt to model the selection process for the Baseball Hall of Fame (HOF) using several statistical techniques. We describe the history and election procedures of the Hall of Fame, as well as the suitability of kernel estimation and machine learning methods for predicting which players will be inducted.

bayesian modeling

Modeling a Prediction Game

9 minute read

Published:

Game Setup

At the start of this year, a few friends and I got together to play a predictions game. We all filled out 5x5 bingo cards with predictions that had to come due in 2024 on any topic in the world, with eternal honor going to the one who got a bingo. Here’s what my card looked like to start:

benign overfitting

Summer Note #1

4 minute read

Published:

Back into the swing of things: attended two talks this week, and prepping for the IMSI Workshop to begin next Monday.

card games

classification

Modeling Players’ Chances of Entering the BBHOF

18 minute read

Published:

This report describes an attempt to model the selection process for the Baseball Hall of Fame (HOF) using several statistical techniques. We describe the history and election procedures of the Hall of Fame, as well as the suitability of kernel estimation and machine learning methods for predicting which players will be inducted.

compute

deep learning

g-rips 2023

G-RIPS Research Musing #4

4 minute read

Published:

Stalled on Gaussian process techniques this week. It is ultimately hard to pin down what precisely GP-regression could achieve – if we already have an aligned shape, we can of course calculate its deformation field and then transform the reference shape into the target shape. That might suggest that GP-regression should be able to map an arbitrary shape to an aligned shape, which may indeed be the case. I still have some confusion about what the training and test data of a GP-regression should be – ultimately I suppose it will have to output coefficients for a weighted linear combination of the eigenbasis derived from the eigenfunctions of the GP kernel.
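A toy numpy sketch of that last idea, in a made-up 1-D setting with an RBF kernel (both are assumptions for illustration): the kernel matrix's eigenvectors stand in for the kernel's eigenfunctions, a deformation is a weighted linear combination of them, and recovering the weights from an observed deformation is a least-squares problem in that basis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "shapes": landmarks on a line (illustrative only).
n_points = 50
ref = np.linspace(0.0, 1.0, n_points)  # reference shape

# A Gaussian (RBF) kernel over the reference domain defines the GP prior
# on deformation fields.
def rbf(x, y, length=0.2, scale=0.05):
    return scale * np.exp(-((x[:, None] - y[None, :]) ** 2) / (2 * length**2))

K = rbf(ref, ref)

# Eigendecomposition of the kernel matrix: eigenvectors play the role of
# the (discretised) eigenfunctions of the GP kernel.
eigvals, eigvecs = np.linalg.eigh(K)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A low-rank deformation: weighted combination of the leading eigenvectors,
# scaled by sqrt(eigenvalue) as in a Karhunen-Loeve expansion.
r = 5
scales = np.sqrt(np.clip(eigvals[:r], 0, None))
alpha = rng.standard_normal(r)            # the coefficients to regress
deformation = eigvecs[:, :r] @ (scales * alpha)
shape = ref + deformation                 # a sampled "target" shape

# Recovering the coefficients is ordinary least squares in this basis --
# one candidate output for a GP-regression.
recovered, *_ = np.linalg.lstsq(eigvecs[:, :r] * scales, shape - ref,
                                rcond=None)
print(np.allclose(recovered, alpha))
```

The basis columns are orthogonal here, so the coefficients come back exactly; with noisy or partially observed shapes the same least-squares step becomes a genuine regression problem.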

G-RIPS Research Musing #3

1 minute read

Published:

Beginning work on a miniproject based on reading the research of Marcel Lüthi and Thomas Gerig on Gaussian process morphable models. Their GPMMs are an expansion of the classical point distribution models. Given a set of shapes in space $\{\Gamma_1,\dots,\Gamma_n\}$, they set one shape as the reference shape $\Gamma_R$ and then consider the deformation fields $u_i$, which are vector fields from $\mathbb{R}^3 \rightarrow \mathbb{R}^3$ such that for each shape $\Gamma_i$ there exists a deformation $u_i$ with $\Gamma_i = \{\bar{x}+u_i(\bar{x}) \mid \bar{x} \in \Gamma_R\}$.
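A minimal sketch of that definition with made-up landmarks (the points and displacement are arbitrary, chosen only to illustrate): the deformation field at each landmark is simply its displacement from the reference, and adding the field back to the reference recovers the target shape:

```python
import numpy as np

# Two toy shapes in R^3 given as corresponding landmark points (rows).
reference = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [1.0, 1.0, 0.0],
                      [0.0, 1.0, 0.0]])
target = reference + np.array([0.1, -0.2, 0.05])  # a uniform shift, for illustration

# The deformation field sampled at the landmarks is the displacement
# u_i(x_bar) = target point minus reference point ...
u = target - reference

# ... and the target shape is recovered as Gamma_i = { x_bar + u_i(x_bar) }.
reconstructed = reference + u
print(np.allclose(reconstructed, target))
```

A GPMM then places a Gaussian process prior over such fields $u$, so that sampling the prior generates new plausible shapes from the reference.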

G-RIPS Research Musing #2

3 minute read

Published:

Of late, have read much more about statistical shape models: their origins and the various ways people have implemented them, their connections to long-running computer vision research, and state-of-the-art work with neural networks.

geometric deep learning

Summer Note #1

4 minute read

Published:

Back into the swing of things: attended two talks this week, and prepping for the IMSI Workshop to begin next Monday.

interval matrix theory

Notes on Interval Matrix Theory

less than 1 minute read

Published:

Interval matrix theory is a subfield of linear algebra concerned with obtaining solutions to systems of equations $Ax = b$, where the entries of $A$ and $b$ take values in intervals.

linear algebra

Notes on Interval Matrix Theory

less than 1 minute read

Published:

Interval matrix theory is a subfield of linear algebra concerned with obtaining solutions to systems of equations $Ax = b$, where the entries of $A$ and $b$ take values in intervals.

linear models

Modeling Players’ Chances of Entering the BBHOF

18 minute read

Published:

This report describes an attempt to model the selection process for the Baseball Hall of Fame (HOF) using several statistical techniques. We describe the history and election procedures of the Hall of Fame, as well as the suitability of kernel estimation and machine learning methods for predicting which players will be inducted.

monte carlo

neural networks

IMSI Algebraic Statistics Note #1

11 minute read

Published:

Spent the week at a workshop at the University of Chicago, “Invitation to Algebraic Statistics,” part of the IMSI long program “Algebraic Statistics and our Changing World”. This week’s workshop featured excellent talks on the application of algebraic statistics to estimation & optimization problems, graphical models, neural networks, algebraic economics, and ecological problems.

Summer Note #1

4 minute read

Published:

Back into the swing of things: attended two talks this week, and prepping for the IMSI Workshop to begin next Monday.

G-RIPS Research Musing #4

4 minute read

Published:

Stalled on Gaussian process techniques this week. It is ultimately hard to pin down what precisely GP-regression could achieve – if we already have an aligned shape, we can of course calculate its deformation field and then transform the reference shape into the target shape. That might suggest that GP-regression should be able to map an arbitrary shape to an aligned shape, which may indeed be the case. I still have some confusion about what the training and test data of a GP-regression should be – ultimately I suppose it will have to output coefficients for a weighted linear combination of the eigenbasis derived from the eigenfunctions of the GP kernel.

G-RIPS Research Musing #3

1 minute read

Published:

Beginning work on a miniproject based on reading the research of Marcel Lüthi and Thomas Gerig on Gaussian process morphable models. Their GPMMs are an expansion of the classical point distribution models. Given a set of shapes in space $\{\Gamma_1,\dots,\Gamma_n\}$, they set one shape as the reference shape $\Gamma_R$ and then consider the deformation fields $u_i$, which are vector fields from $\mathbb{R}^3 \rightarrow \mathbb{R}^3$ such that for each shape $\Gamma_i$ there exists a deformation $u_i$ with $\Gamma_i = \{\bar{x}+u_i(\bar{x}) \mid \bar{x} \in \Gamma_R\}$.

G-RIPS Research Musing #2

3 minute read

Published:

Of late, have read much more about statistical shape models: their origins and the various ways people have implemented them, their connections to long-running computer vision research, and state-of-the-art work with neural networks.

probability

random forests

Modeling Players’ Chances of Entering the BBHOF

18 minute read

Published:

This report describes an attempt to model the selection process for the Baseball Hall of Fame (HOF) using several statistical techniques. We describe the history and election procedures of the Hall of Fame, as well as the suitability of kernel estimation and machine learning methods for predicting which players will be inducted.

reinforcement learning

Notes on Modeling POMDPs: Crites Elevator

1 minute read

Published:

In his PhD thesis, Crites (see Crites & Barto, 1998) analyzed the problem of controlling multiple elevators, modeling the system as a multi-agent POMDP and noting both the extreme size of the state space — with 4 elevators in a 10-story building, on the order of $10^{22}$ states — and the effectiveness of deep $Q$-learning techniques in attacking the problem.

sample complexity

scaling laws

sports data

statistical shape modeling

G-RIPS Research Musing #4

4 minute read

Published:

Stalled on Gaussian process techniques this week. It is ultimately hard to pin down what precisely GP-regression could achieve – if we already have an aligned shape, we can of course calculate its deformation field and then transform the reference shape into the target shape. That might suggest that GP-regression should be able to map an arbitrary shape to an aligned shape, which may indeed be the case. I still have some confusion about what the training and test data of a GP-regression should be – ultimately I suppose it will have to output coefficients for a weighted linear combination of the eigenbasis derived from the eigenfunctions of the GP kernel.

G-RIPS Research Musing #3

1 minute read

Published:

Beginning work on a miniproject based on reading the research of Marcel Lüthi and Thomas Gerig on Gaussian process morphable models. Their GPMMs are an expansion of the classical point distribution models. Given a set of shapes in space $\{\Gamma_1,\dots,\Gamma_n\}$, they set one shape as the reference shape $\Gamma_R$ and then consider the deformation fields $u_i$, which are vector fields from $\mathbb{R}^3 \rightarrow \mathbb{R}^3$ such that for each shape $\Gamma_i$ there exists a deformation $u_i$ with $\Gamma_i = \{\bar{x}+u_i(\bar{x}) \mid \bar{x} \in \Gamma_R\}$.

G-RIPS Research Musing #2

3 minute read

Published:

Of late, have read much more about statistical shape models: their origins and the various ways people have implemented them, their connections to long-running computer vision research, and state-of-the-art work with neural networks.

value function

Notes on Modeling POMDPs: Crites Elevator

1 minute read

Published:

In his PhD thesis, Crites (see Crites & Barto, 1998) analyzed the problem of controlling multiple elevators, modeling the system as a multi-agent POMDP and noting both the extreme size of the state space — with 4 elevators in a 10-story building, on the order of $10^{22}$ states — and the effectiveness of deep $Q$-learning techniques in attacking the problem.