Both Logistic Regression and Naive Bayes are probabilistic models.
Logistic Regression
- Discriminative model (the entire approach directly models the decision boundary, not the data)
- Models P(Y|X) directly
- Output value lies between 0 and 1
- Formula given by exp(w0 + w1x) / (exp(w0 + w1x) + 1)
- Can equivalently be expressed as 1 / (1 + exp(-(w0 + w1x))); a quick numeric check of the equivalence follows below
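As a minimal sketch of the equivalence (the weights w0 = -1, w1 = 2 and input x = 1 are made-up values chosen only for illustration):

import math

w0, w1, x = -1.0, 2.0, 1.0   # hypothetical weights and input
z = w0 + w1 * x

p1 = math.exp(z) / (math.exp(z) + 1)   # exp(z) / (exp(z) + 1)
p2 = 1 / (1 + math.exp(-z))            # 1 / (1 + exp(-z))

print(p1, p2)  # both print ~0.731, confirming the two forms agree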
Binary Logistic Regression - 2 classes
Multinomial Logistic Regression - more than 2 classes
Example:
Logistic Regression
import math
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    # element-wise logistic function 1 / (1 + e^(-x))
    return [1 / (1 + math.exp(-item)) for item in x]

# Larger weights make the sigmoid transition around zero steeper;
# the input range is kept small so math.exp does not overflow.
x = np.arange(-5, 5, 0.1)
for w in (1, 5, 100):
    plt.plot(x, sigmoid(w * x))
    plt.title('Sigmoid Weight %d' % w)
    plt.show()
# 3-class classifier on the iris data set
import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model, datasets

iris = datasets.load_iris()
# only the first two features (sepal length, sepal width) are taken
X = iris.data[:, :2]
Y = iris.target

# step size in the mesh
h = 0.02

# create an instance of the classifier and fit the data
logreg = linear_model.LogisticRegression(C=1e5)
logreg.fit(X, Y)

# plot the decision boundary and assign a color to each region
x_min, x_max = X[:, 0].min() - 0.5, X[:, 0].max() + 0.5
y_min, y_max = X[:, 1].min() - 0.5, X[:, 1].max() + 0.5
xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
Z = logreg.predict(np.c_[xx.ravel(), yy.ravel()])

# put the result into a color plot
Z = Z.reshape(xx.shape)
plt.figure(1, figsize=(4, 3))
plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired)

# also plot the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, edgecolors='k', cmap=plt.cm.Paired)
plt.xlabel('Sepal Length')
plt.ylabel('Sepal Width')
plt.xlim(xx.min(), xx.max())
plt.ylim(yy.min(), yy.max())
plt.xticks(())
plt.yticks(())
plt.show()
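Since logistic regression predicts class probabilities, it can be useful to inspect them directly. A minimal follow-up sketch, reusing the logreg model fitted above (the query point, sepal length 5.0 and sepal width 3.5, is a made-up value):

# per-class probability for one made-up point;
# columns are ordered as in logreg.classes_
sample = np.array([[5.0, 3.5]])
print(logreg.classes_)
print(logreg.predict_proba(sample))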
Logistic Regression
- Classification Model
- Probability of success as a sigmoid function of a linear combination of features
- y ∈ {0, 1} for the 2-class problem
- p(yi = 1) = 1 / (1 + e^(-(w1x1 + w2x2)))
- Linear combination of features: w1x1 + w2x2
- The weights w can be found by maximum likelihood estimation; see the sketch after this list
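As a rough illustration of the maximum-likelihood step, the sketch below fits w by gradient ascent on the log-likelihood. The toy data, learning rate, and iteration count are all assumptions made up for the example:

import numpy as np

# toy 2-feature data: class 1 tends to have larger feature values
X = np.array([[0.5, 1.0], [1.0, 0.5], [2.0, 2.5], [3.0, 2.0]])
y = np.array([0, 0, 1, 1])

w = np.zeros(2)   # weights w1, w2 (bias omitted, matching the formula above)
lr = 0.1          # learning rate (arbitrary choice)

for _ in range(1000):
    p = 1 / (1 + np.exp(-X @ w))   # predicted P(y=1) per sample
    w += lr * X.T @ (y - p)        # gradient of the log-likelihood

print(w)  # weights maximizing the likelihood on this toy data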
Naive Bayes
- Generative Model
- Models P(X|Y); the features are assumed conditionally independent given the class, which is the Naive Bayes assumption
- Learns a distribution over the features for each class; see the sketch below
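A minimal sketch of the generative view, assuming Gaussian class-conditional densities (scikit-learn's GaussianNB, fitted here on the same two iris features used in the example above, is one concrete choice):

from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB

iris = load_iris()
X, Y = iris.data[:, :2], iris.target  # same two features as above

gnb = GaussianNB().fit(X, Y)
# theta_ holds the per-class mean of each feature, i.e. part of the
# learned distribution P(X | Y) for each class
print(gnb.theta_)
print(gnb.predict_proba(X[:2]))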