CS330 lecture 1&2 notes
Informal Problem Definitions

The multi-task learning problem: learn all of the tasks more quickly and more proficiently than learning each task independently.
The meta-learning problem: given data/experience on previous tasks, learn a new task more quickly and/or more proficiently.
Multi-Task Learning Basics
What's a task?

Tasks may mix discrete and continuous labels (i.e., classification and regression trained jointly).
You may care more about one task than another (i.e., weight the tasks differently in the objective).
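Task preferences can be folded directly into the multi-task objective. A common form, with a per-task weight \(w_i\) (hand-set or adjusted dynamically during training), is:

```latex
\min_{\theta} \; \sum_{i=1}^{T} w_{i}\, \mathscr{L}_{i}\left(\theta, \mathscr{D}_{i}\right)
```

Setting all \(w_i = 1\) recovers the plain summed objective; raising \(w_i\) prioritizes task \(i\).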
Conditioning on the task
Common Choices

Additive conditioning (add a task embedding to the hidden features)

Multi-head architecture (shared trunk, one output head per task)

Multiplicative conditioning (multiply the hidden features by a task embedding)
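The three conditioning choices above can be sketched in a few lines. This is a minimal illustration with made-up dimensions and a one-hot task descriptor; in practice the task embedding and all weights would be learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hot(task_id, num_tasks):
    """Task descriptor z_i as a one-hot vector (illustrative choice)."""
    z = np.zeros(num_tasks)
    z[task_id] = 1.0
    return z

# Hypothetical dimensions: input, task descriptor, hidden layer.
d_x, d_z, d_h = 4, 3, 8
W_x = rng.normal(size=(d_h, d_x))  # shared input weights
W_z = rng.normal(size=(d_h, d_z))  # task-embedding weights

x = rng.normal(size=d_x)
z = one_hot(1, d_z)  # condition on task 1

# Additive conditioning: task embedding is added to the hidden features.
h_add = W_x @ x + W_z @ z

# Multiplicative conditioning: task embedding gates the hidden features.
h_mul = (W_x @ x) * (W_z @ z)

# Multi-head architecture: shared trunk, one output head per task.
heads = [rng.normal(size=(1, d_h)) for _ in range(d_z)]
y_task1 = heads[1] @ np.maximum(h_add, 0)  # select the head for task 1
```

Multiplicative conditioning is more expressive than additive (it can zero out features per task), while the multi-head design shares no output parameters across tasks at all.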
Complex Choices
Optimizing the objective

Sample a mini-batch of tasks: \(\mathscr{B} \sim \{\mathscr{T}_{i}\}\)
Sample a mini-batch of data from each task: \(\mathscr{D}_{i}^{b} \sim \mathscr{D}_{i}\)
Compute the loss on the mini-batch: \(\hat{\mathscr{L}}(\theta, \mathscr{B}) = \sum_{\mathscr{T}_{k} \in \mathscr{B}} \mathscr{L}_{k}\left(\theta, \mathscr{D}_{k}^{b}\right)\)
Backpropagate to compute the gradient: \(\nabla_{\theta} \hat{\mathscr{L}}\)
Apply the gradient with your favorite neural-network optimizer
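The sampling loop above can be sketched end to end. This is a toy sketch, not the course's reference code: each "task" is an assumed linear-regression problem with its own true weights, the shared model is a single linear map \(\theta\), and plain SGD stands in for "your favorite optimizer". The analytic gradient replaces backpropagation, which is exact for this linear model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumption): task i is linear regression y = A_i x.
num_tasks, d = 5, 3
true_weights = [rng.normal(size=d) for _ in range(num_tasks)]

def make_dataset(a, n=100):
    X = rng.normal(size=(n, d))
    return X, X @ a

datasets = [make_dataset(a) for a in true_weights]

def multitask_loss(theta):
    """Average squared error over all tasks (for monitoring only)."""
    return float(np.mean([np.mean((X @ theta - y) ** 2) for X, y in datasets]))

theta = np.zeros(d)                  # shared parameters
lr, task_bs, data_bs = 0.01, 3, 16   # step size and mini-batch sizes

loss_before = multitask_loss(theta)
for step in range(500):
    # 1. Sample a mini-batch of tasks B ~ {T_i}.
    batch = rng.choice(num_tasks, size=task_bs, replace=False)
    grad = np.zeros_like(theta)
    for k in batch:
        X, y = datasets[k]
        # 2. Sample a mini-batch of data from task k: D_k^b ~ D_k.
        idx = rng.choice(len(X), size=data_bs, replace=False)
        Xb, yb = X[idx], y[idx]
        # 3. Per-task squared-error loss L_k contributes to L_hat(theta, B).
        err = Xb @ theta - yb
        # 4. Gradient of L_k (analytic here, backprop in a deep network).
        grad += 2.0 * Xb.T @ err / data_bs
    # 5. Apply the gradient (plain SGD stands in for any optimizer).
    theta -= lr * grad
loss_after = multitask_loss(theta)
```

Because the tasks have conflicting optimal weights, the shared \(\theta\) converges to a compromise; the average loss drops but cannot reach zero, which is one face of the negative-transfer issue discussed next.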
Challenges

Negative transfer: training tasks together performs worse than training them independently (sharing can hurt).

Overfitting: the opposite failure mode; sharing more across tasks acts as a regularizer.
Meta-Learning Basics
Problem definitions