User's Guide to AI

Stochastic gradient descent (SGD)

Machine Learning

Stochastic Gradient Descent (SGD) is an iterative method for optimizing an objective function, typically used to train machine learning models. It updates the model's parameters by moving in the direction of steepest descent, defined by the negative of the gradient. Unlike traditional gradient descent, which computes the gradient over the entire dataset, SGD randomly selects a single example or a small mini-batch at each step. This makes SGD faster per update and more scalable, especially for large datasets.
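The update rule above can be sketched in a few lines of Python. This is a minimal illustration of mini-batch SGD fitting a simple linear model by squared error; the data, learning rate, and batch size are illustrative assumptions, not part of the article.

```python
import random

def sgd_linear(xs, ys, lr=0.05, epochs=500, batch_size=2):
    """Fit y = w*x + b with mini-batch SGD on mean squared error.

    A minimal sketch; hyperparameters here are illustrative, not
    prescriptive.
    """
    w, b = 0.0, 0.0
    n = len(xs)
    idx = list(range(n))
    for _ in range(epochs):
        random.shuffle(idx)  # random subset selection each pass
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            # Gradient of mean squared error over the mini-batch only
            gw = gb = 0.0
            for i in batch:
                err = (w * xs[i] + b) - ys[i]
                gw += 2 * err * xs[i]
                gb += 2 * err
            gw /= len(batch)
            gb /= len(batch)
            # Step in the direction of steepest descent (negative gradient)
            w -= lr * gw
            b -= lr * gb
    return w, b
```

Because each step sees only a mini-batch, individual updates are noisy, but on average they follow the full gradient, which is what makes the method scale to datasets too large to process in one pass.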

