The perceptron produces scores that only indicate whether one instance is more negative/positive than another; there is no notion of certainty/probability.
The sigmoid function $\sigma$ "squishes" any real number into the interval $(0, 1)$: $\sigma(z) = \frac{1}{1 + e^{-z}}$
Logistic regression: $P(y = 1 \mid \mathbf{x}) = \sigma(\mathbf{w} \cdot \mathbf{x})$
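A minimal sketch of the two formulas above (the weights and features here are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    # Squishes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(w, x):
    # Logistic regression: P(y = 1 | x) = sigmoid(w . x)
    return sigmoid(np.dot(w, x))

# Hypothetical weights and features:
w = np.array([2.0, -1.0])
x = np.array([0.5, 0.25])
print(predict_proba(w, x))  # sigmoid(0.75) ~ 0.679
```

Unlike a raw perceptron score, this output can be read directly as a probability.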
Not all classifications we need are binary:
The binary perceptron cannot do it...
Many of them together, though, can!
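One common way to combine binary perceptrons into a multiclass classifier is to keep one weight vector per class and predict the class with the highest score; a minimal sketch (weights and features are hypothetical):

```python
import numpy as np

def predict_class(W, x):
    # One perceptron per class: row k of W scores class k.
    scores = W @ x
    # Predict the class whose perceptron gives the highest score.
    return int(np.argmax(scores))

# Hypothetical 3-class problem with 2 features.
W = np.array([[ 1.0,  0.0],
              [ 0.0,  1.0],
              [-1.0, -1.0]])
x = np.array([0.2, 0.9])
print(predict_class(W, x))  # class 1 scores highest here
```

Each row acts as a separate binary scorer, and the argmax turns the collection into a single multiclass decision.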