
L is for Logistic regression to LSTM



From A to I to Z: Jaid’s Guide to Artificial Intelligence

Logistic regression and LSTM (Long Short-Term Memory) are two machine learning techniques commonly used for binary classification. 

Binary classification is a type of machine learning task in which a model is trained to predict which of two possible outcomes is correct, usually true or false, positive or negative, or 0 or 1. 
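At its simplest, binary classification boils down to turning a model's score into one of two labels. A minimal sketch (the function name and 0.5 threshold are illustrative choices, not part of any particular library):

```python
def classify(score, threshold=0.5):
    # Map a model's confidence score to one of two classes:
    # scores at or above the threshold count as positive (1),
    # anything below counts as negative (0).
    return 1 if score >= threshold else 0
```

Everything that follows, from logistic regression to LSTM, is ultimately a different way of producing that score.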

What’s logistic regression and how does it work?

In logistic regression, a model is trained to estimate the probability of one of two outcomes from a set of input variables. It does this with a mathematical function, the logistic (or sigmoid) function, which squashes any weighted combination of those variables into a value between 0 and 1. 

Imagine you were queueing outside a club. 

An imposing security guard stands at the doorway, and he’s letting some people in and turning others away based on how old they look, what they’re wearing, and their demeanor.

In logistic regression:

  • AI would be the security guard
  • Age, attire, and demeanor would be the variables the AI uses to predict an outcome
  • Admittance or being turned away would be the two possible outcomes
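The guard's decision can be sketched as a logistic regression in a few lines. The weights, bias, and feature scores below are made up for illustration; a real model learns these values from data:

```python
import math

def sigmoid(z):
    # The logistic function: squashes any real number into the range (0, 1),
    # so the result can be read as a probability.
    return 1.0 / (1.0 + math.exp(-z))

def admit_probability(age, dress_score, demeanor_score, weights, bias):
    # Weighted sum of the guard's variables, passed through the sigmoid.
    z = (weights[0] * age
         + weights[1] * dress_score
         + weights[2] * demeanor_score
         + bias)
    return sigmoid(z)

# Hypothetical learned weights and bias.
weights = [0.08, 1.2, 0.9]
bias = -6.0

p = admit_probability(age=25, dress_score=2.0, demeanor_score=1.5,
                      weights=weights, bias=bias)
decision = "admit" if p >= 0.5 else "turn away"
```

With these made-up numbers, p works out to roughly 0.44, so this particular guest gets turned away.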

Logistic regression is useful in scenarios where the relationship between the variables and the outcome is roughly linear (strictly speaking, where the log-odds of the outcome are a linear function of the variables). Examples of such scenarios include fraud detection, credit scoring, and medical diagnosis.

It may not perform as well when the relationship between variables and outputs is more nuanced. 

What’s LSTM and how does it work?

LSTM is a recurrent neural network: a machine learning model designed to handle sequences, with a structure inspired by the human brain. Like the brain, it consists of a series of interconnected cells. Each cell processes information before passing it on to the next cell. 

Unlike logistic regression, LSTM makes predictions by evaluating sequences of data — a process that’s very similar to how you’d put a puzzle together. 

It starts by processing the individual pieces of a sequence one at a time. Over time, it joins these bits and pieces together to build a larger picture; finally, it determines which parts of the picture are and aren't relevant, and uses the relevant bits to make a binary prediction. 

Where logistic regression is the hammer of binary classification, LSTM is the Swiss army knife. 

LSTM doesn’t assume that the input and output have a linear relationship, so it tends to perform better in nuanced scenarios such as predicting weather patterns, stock market movements, and speech recognition. 

Some facts:

While logistic regression and LSTM are both widely used in binary classification to this day, they have very different origins. 

English statistician David Cox introduced logistic regression in 1958, after realizing that the regression models popular at the time could produce outcomes outside the range of 0 to 1, values that aren't valid probabilities. 

By contrast, LSTM only became possible in 1997, once computing technology had caught up and could adequately support AI. 

Sepp Hochreiter, Jürgen Schmidhuber, and their colleagues created LSTM in order to solve what is known as the vanishing gradient problem. At the time, neural networks' gradients of error (the signals used to nudge a network's parameters toward better values during training) tended to shrink toward zero as they were propagated back through many steps of a sequence. This made it difficult to learn long-term dependencies. 

LSTM solves this by storing information in a dedicated cell state that can persist over long periods, with gates that control what flows into and out of it. In essence, its creators built the AI equivalent of what humans call long-term memory. 
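Those gates can be sketched in a few lines. The sketch below uses single numbers where a real LSTM uses vectors and matrices, and the weights are made up for illustration (a trained LSTM learns them from data):

```python
import math

def sigmoid(z):
    # Gate activations live in (0, 1): 0 means "block", 1 means "let through".
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    # One step of a scalar LSTM cell. Each gate looks at the current
    # input x and the previous hidden state h_prev.
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate: how much old memory to keep
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate: how much new info to store
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate new memory content
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate: how much memory to expose
    c = f * c_prev + i * g   # updated cell state: the "long-term memory"
    h = o * math.tanh(c)     # updated hidden state: the cell's output
    return h, c

# Hypothetical weights; a real model learns these during training.
w = {k: 0.5 for k in ["wf", "uf", "bf", "wi", "ui", "bi",
                      "wg", "ug", "bg", "wo", "uo", "bo"]}

h, c = 0.0, 0.0
for x in [0.2, -0.1, 0.4]:   # process a short sequence one element at a time
    h, c = lstm_step(x, h, c, w)
```

The key line is the cell-state update: because the forget gate can stay close to 1, information can survive many steps largely untouched, which is exactly what the vanishing gradient problem prevented in earlier recurrent networks.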

Logistic regression has relatively low computing power requirements and is fairly easy to interpret, so it’s ideal for simple binary classification tasks. 

LSTM, on the other hand, requires significant resources and is expensive to train. The trade-off is that it can handle problems that are orders of magnitude more complex than logistic regression.

Want to know more?

This Machine Learning Mastery tutorial walks you through a typical binary classification problem step by step. 

Binary classification is just one type of classification in machine learning. The others are multi-class, multi-label, and imbalanced. This article walks you through all four types of classification: what they are, how they work, and key use cases.

Jaid’s perspective

Logistic regression and LSTM have a wide variety of use cases, all of which enhance AI and make it more powerful, accurate, and useful. In customer service, logistic regression and LSTM help AI improve its ability to predict sentiment, intent, and overall behavior, which allows it to deliver better service and a more satisfying experience with little to no human input.

Optimize your customer service experience by utilizing Jaid’s AI-powered platform – contact us today!