# In Math

If \(p\) is a probability:

- The **odds** is \(\frac{p}{1-p}\).
- The **logit** (**log**istic un**it**) function, or the **log-odds**, is \(\operatorname{logit}(p) = \log \frac{p}{1-p}\) in statistics.
- The logit function maps probability values from \((0, 1)\) to \((-\infty, +\infty)\).

- The **logistic function**, or the **sigmoid function**, is the inverse logit: \(\operatorname{logistic}(x) = \operatorname{logit}^{-1}(x) = \frac{1}{1+e^{-x}} = \frac{e^{x}}{e^{x}+1} = p\).
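The definitions above can be sketched in Python (the function names are illustrative, mirroring the notation here):

```python
import math

def logit(p):
    # log-odds: maps a probability in (0, 1) to (-inf, +inf)
    return math.log(p / (1 - p))

def sigmoid(x):
    # logistic function, the inverse of logit:
    # maps (-inf, +inf) back to (0, 1)
    return 1 / (1 + math.exp(-x))

p = 0.8
x = logit(p)        # log(0.8 / 0.2) = log(4) ≈ 1.386
print(sigmoid(x))   # recovers p ≈ 0.8
```

Note that `sigmoid(0) == 0.5`, matching the fact that even odds (\(p/(1-p) = 1\)) correspond to a log-odds of zero.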

# In Machine Learning

**Logits** are the **vector of raw (non-normalized) predictions** that a classification model generates, which is ordinarily then passed to a normalization function: the **sigmoid function** in binary classification or the **softmax function** in multi-class classification.
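A minimal sketch of the multi-class case, with a hypothetical logits vector (softmax implemented directly from its definition, with the standard max-subtraction trick for numerical stability):

```python
import math

def softmax(logits):
    # normalize a vector of raw scores into a probability distribution
    m = max(logits)                             # subtract max for stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]     # hypothetical raw model outputs
probs = softmax(logits)
print(probs)                 # non-negative, sums to 1
```

Softmax preserves the ordering of the logits, so the class with the largest raw score also gets the largest probability.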