$$ J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log\big( h_\theta(x^{(i)}) \big) + \big(1 - y^{(i)}\big) \log\big( 1 - h_\theta(x^{(i)}) \big) \right] $$
This is the cost function for logistic regression. To apply gradient descent, I need to calculate its partial derivative with respect to each parameter $\theta_j$.
How do I calculate $\frac{\partial}{\partial\theta_j} J(\theta)$?
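For context, I am assuming $h_\theta$ is the usual sigmoid hypothesis for logistic regression:
$$ h_\theta(x) = \sigma(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}} $$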
Edit: I am thinking it will be along the lines of the product rule for derivatives, but I am unable to work it out.
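Here is how far I get. Assuming the sigmoid hypothesis above, I believe its derivative satisfies $\frac{\partial}{\partial\theta_j} h_\theta(x) = h_\theta(x)\big(1 - h_\theta(x)\big)\,x_j$, so applying the chain rule to the first log term gives
$$ \frac{\partial}{\partial\theta_j} \log h_\theta(x^{(i)}) = \frac{1}{h_\theta(x^{(i)})} \, \frac{\partial h_\theta(x^{(i)})}{\partial\theta_j} = \big(1 - h_\theta(x^{(i)})\big)\, x_j^{(i)}, $$
but I do not see how to handle the $\big(1 - y^{(i)}\big)\log\big(1 - h_\theta(x^{(i)})\big)$ term and simplify the full sum.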