In short, cross-entropy (CE) measures how far your predicted value is from the true label.
The "cross" here refers to computing the entropy *across* two distributions: the probabilities your model predicts and the true labels (like 0, 1).
And "entropy" itself refers to randomness/uncertainty, so a large value means your prediction is far off from the real labels.
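To make that concrete, here is a minimal sketch of binary cross-entropy in plain Python (the helper name `binary_cross_entropy` and the sample probabilities are just for illustration). Notice how the loss blows up as the prediction moves away from the true label:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip the prediction so log() never sees exactly 0 or 1
    y_pred = min(max(y_pred, eps), 1 - eps)
    # CE = -[y*log(p) + (1-y)*log(1-p)]
    return -(y_true * math.log(y_pred) + (1 - y_true) * math.log(1 - y_pred))

# True label is 1: a confident, correct prediction gives a small loss...
print(binary_cross_entropy(1, 0.9))   # ~0.105
# ...while a prediction far from the label gives a large loss
print(binary_cross_entropy(1, 0.1))   # ~2.303
```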
So during training the weights are adjusted to reduce CE, which shrinks the difference between the predictions and the true labels and thus improves accuracy.
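As a toy illustration of that training loop (all values here — the input `x`, label `y`, starting weight `w`, and learning rate `lr` — are made up), a few gradient-descent steps on a one-weight logistic model show the CE dropping as the weight moves toward a better prediction:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x, y = 2.0, 1.0   # one input and its true label
w = -1.0          # deliberately bad starting weight
lr = 0.5          # learning rate

for step in range(5):
    p = sigmoid(w * x)                                        # predicted probability
    ce = -(y * math.log(p) + (1 - y) * math.log(1 - p))       # cross-entropy loss
    print(f"step {step}: w={w:+.3f} p={p:.3f} CE={ce:.3f}")
    grad = (p - y) * x                                        # dCE/dw for sigmoid + CE
    w -= lr * grad                                            # gradient descent update
```

Each step the weight moves against the gradient of CE, the predicted probability gets closer to the true label, and the loss shrinks.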