The basic idea of Gradient Descent is to traverse the surface of the loss function in the direction of the negative gradient, moving step by step toward a local minimum.
This short video explains the variants Batch Gradient Descent, Mini-Batch Gradient Descent, and Stochastic Gradient Descent: the main differences between them and when each is used.
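The three variants differ only in how many training examples feed each gradient step: the full dataset (batch), a small subset (mini-batch), or a single example (stochastic). A minimal NumPy sketch might look like the following; the data, learning rate, and epoch count are hypothetical choices for illustration, not taken from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = 3x + noise (hypothetical example).
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + rng.normal(0, 0.1, size=200)

def gradient(w, Xb, yb):
    """Gradient of mean-squared error for a single weight w on batch (Xb, yb)."""
    pred = Xb[:, 0] * w
    return 2.0 * np.mean((pred - yb) * Xb[:, 0])

def train(batch_size, lr=0.1, epochs=50):
    """batch_size == len(X): batch GD; == 1: SGD; in between: mini-batch GD."""
    w = 0.0
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)              # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            w -= lr * gradient(w, X[b], y[b])  # one update per (mini-)batch
    return w

w_batch = train(batch_size=len(X))  # one update per epoch, smooth descent
w_mini  = train(batch_size=32)      # the usual compromise in practice
w_sgd   = train(batch_size=1)       # one update per example, noisy descent
print(w_batch, w_mini, w_sgd)       # estimates of the true slope (3.0)
```

The only change between the three runs is `batch_size`, which is exactly the distinction the video draws: batch GD gives exact but expensive gradients, SGD gives cheap but noisy ones, and mini-batch sits in between.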