The derivative $f'(x)$ is closely related to $f(x)$. In fact, $f'$ is completely determined by $f$, since it describes how $f$ changes as $x$ changes, so the relationship is stronger than a mere correlation.
The derivative at a specific point $c$ of the domain, i.e. $f'(c)$, can be negative, zero, or positive. If $f'(c) > 0$, then $f$ is increasing at $c$ (with respect to an increase of $x$). If $f'(c) < 0$, then $f$ is decreasing at $c$ (with respect to an increase of $x$).
This can easily be seen from an example. Consider $f(x) = x^2$, then $f'(x) = 2x$. Let $c = 2$, then $f'(2) = 4 > 0$, so the function is increasing there. In fact, $f(1) = 1 \leq f(2) = 4 \leq f(3) = 9$. Similarly, let $c = -1$, then $f'(-1) = -2 < 0$, so the function is decreasing there. In fact, $f(-2) = 4 \geq f(-1) = 1 \geq f(0) = 0$ (note that the function is decreasing as we increase $x$!).
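We can verify this example numerically with a minimal sketch (the function names are just illustrative):

```python
# f(x) = x^2 and its derivative f'(x) = 2x
def f(x):
    return x ** 2

def f_prime(x):
    return 2 * x

# At c = 2 the derivative is positive, so f is increasing around that point.
print(f_prime(2))          # 4
print(f(1), f(2), f(3))    # 1 4 9  (increasing)

# At c = -1 the derivative is negative, so f is decreasing around that point.
print(f_prime(-1))         # -2
print(f(-2), f(-1), f(0))  # 4 1 0  (decreasing as x increases)
```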
Consider a model with only one parameter: the partial derivative of the loss function with respect to that parameter is just the ordinary derivative of the loss, so the reasoning above applies directly. What about a model with more than one parameter? The same reasoning applies to each parameter separately: each partial derivative tells us whether the loss increases or decreases as that parameter grows, with all the other parameters held fixed, as the sketch below illustrates.
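Here is a minimal sketch of that idea (the loss and parameter names are made up for illustration), approximating each partial derivative with a finite difference:

```python
# Toy loss with two parameters: L(w1, w2) = (w1 - 3)^2 + (w2 + 1)^2
def loss(w1, w2):
    return (w1 - 3) ** 2 + (w2 + 1) ** 2

def partial_derivatives(w1, w2, eps=1e-6):
    # Finite-difference approximations of dL/dw1 and dL/dw2:
    # each one perturbs a single parameter while keeping the other fixed.
    dL_dw1 = (loss(w1 + eps, w2) - loss(w1, w2)) / eps
    dL_dw2 = (loss(w1, w2 + eps) - loss(w1, w2)) / eps
    return dL_dw1, dL_dw2

d1, d2 = partial_derivatives(w1=0.0, w2=0.0)
print(d1)  # ~ -6: negative, so the loss decreases if we increase w1
print(d2)  # ~  2: positive, so the loss increases if we increase w2
```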
If the function decreases, does its derivative also decrease? In general, no, and this can easily be seen from a plot of a function and its derivative. For example, consider a plot of a parabola and its derivative (which is a linear function).
To the left of the y-axis, the parabola is decreasing but its derivative is increasing (it climbs from large negative values toward zero), while to the right of the y-axis the parabola is increasing and its derivative keeps increasing.
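A quick table of values makes this concrete (a minimal sketch, using the same $f(x) = x^2$ as above):

```python
# Tabulate f(x) = x^2 and f'(x) = 2x around the y-axis:
# f decreases as x goes from -3 to 0 and then increases,
# while f' increases over the whole range.
for x in [-3, -2, -1, 0, 1, 2, 3]:
    print(f"x = {x:2d}   f(x) = {x**2:2d}   f'(x) = {2*x:3d}")
```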
The same holds for the loss function of an ML model and its partial derivatives: the loss decreasing does not imply that its partial derivatives are decreasing.