"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

April 22, 2016

Day #17 - Python Basics

#Example #1 - numpy version
import numpy as np
np.version.version
#Shift+Enter Command
#Example #2 - Vector of 10 null elements
import numpy as np
x = np.zeros(10)
x
#Example #3 - one element not null
import numpy as np
x = np.zeros(10)
x[4] = 1
x
#Example #4 - vector of range of values
import numpy as np
x = np.arange(10,49,1)
x
#Example #5 - Reverse Vector
import numpy as np
x = np.arange(10,49,1)
xrev = x[::-1]
xrev
#Example #6 - 3 x 3 matrix
#[0..8]
import numpy as np
a = np.arange(9).reshape(3, 3)
a
#Example #7 - Indices of non zero elements
import numpy as np
a = [1,2,0,0,4,0]
aval = np.nonzero(a)[0]
print(aval)
#Example #8 - 3 x 3 Identity matrix
import numpy as np
a = np.eye(3, dtype=int)
a
#Example #9 - 3 x 3 x 3 array of random values
import numpy as np
n = 3
a = np.random.random((n,n,n))
a
#Also found this link while doing the exercises - http://www.labri.fr/perso/nrougier/teaching/numpy.100/
import matplotlib.pyplot as plt
#Declare Dictionary
result = {}
#Populate Data
for i in range(1,100,1):
    result[i] = (i, i*9)
#Approach #1 - look up each key by index
print('Approach #1')
for i in range(1,100,1):
    (key, value) = result[i]
    print(key)
    print(value)
#Approach #2 - iterate over key/value pairs
print('Approach #2')
for key, value in result.items():
    print(key)
    print(value[1])
#Plot the line - collect the points first so the line is drawn in one call
def plotgraph(result):
    keys = []
    values = []
    for key, value in result.items():
        keys.append(key)
        values.append(value[1])
    plt.plot(keys, values, '-o')
    plt.xlabel("X Values")
    plt.ylabel("Y Values")
    plt.title("Plot Graph")
    plt.show()
plotgraph(result)
Happy Learning!!!

Neural Networks Basics


Notes from Session
  • Neurons and synapses - model the brain at a high level
  • Machine Learning - algorithms for classification and prediction
  • Mimic brain structure in technology
  • Recommender engines use neural networks
  • With more data we can increase the accuracy of models
  • Linear Regression, y = mx + b - fit the data set with as little error as possible (a quick fit sketch follows below)
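A minimal sketch of the y = mx + b fit using numpy's polyfit; the sample values are illustrative, not from the session:
import numpy as np
# Noisy samples around y = 2x + 1 (illustrative values)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
# Fit a degree-1 polynomial: returns the slope m and intercept b
m, b = np.polyfit(x, y, 1)
print(m, b)            # close to 2 and 1
print(m * x + b - y)   # residuals - the error being minimized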
Neural Network
  • The equation starts from the neuron
  • Multiply weights by inputs (weights are the coefficients)
  • Apply an activation function (depends on the problem being solved) - see the sketch below
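A minimal sketch of a single neuron in numpy; the weights, inputs, and sigmoid activation below are illustrative choices, not from the session:
import numpy as np

def sigmoid(z):
    # Squashes the weighted sum into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

inputs = np.array([0.5, 0.3, 0.2])     # x
weights = np.array([0.4, 0.7, -0.2])   # coefficients learned during training
bias = 0.1

z = np.dot(weights, inputs) + bias     # multiply weights by inputs and sum
output = sigmoid(z)                    # apply the activation function
print(output)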
Basic Structure
  • Input Layer
  • Hidden Layer (Multiple hidden layers) - Computation done @ hidden layer
  • Output Layer
  • Supervised learning (Train & Test)
  • The loss function defines how the error is measured
  • Deep Learning - Automatic Feature Detection


Happy Learning!!!

April 14, 2016

Basics - SUPPORT VECTOR MACHINES

Good Reading from link

Key Notes
  • Allows non-linear decision boundaries
  • SVM - out-of-the-box supervised learning technique
  • Feature Space - finite dimensional vector space
  • Each dimension represents a feature
  • Goal of SVM - train a model that assigns unseen objects to a particular category
  • Creates a linear partition of the feature space
  • Based on its features, an object is placed above or below the separating hyperplane
  • No stochastic element involved (no dependence on any previous state)
  • Support vector classifiers, or soft margin classifiers, allow some observations to be on the incorrect side of the hyperplane - a soft margin (see the sketch after these notes)
Advantage
  • High Dimensionality, Memory Efficiency, Versatility
Disadvantages
  • Non probabilistic
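A minimal sketch of a soft-margin SVM classifier, assuming scikit-learn is available; the C parameter controls how tolerant the margin is of misclassified points:
from sklearn import datasets
from sklearn.svm import SVC

iris = datasets.load_iris()
X, y = iris.data, iris.target

# Smaller C -> softer margin (more tolerance for points on the wrong side);
# the RBF kernel allows a non-linear decision boundary.
clf = SVC(kernel='rbf', C=1.0)
clf.fit(X, y)
print(clf.predict(X[:5]))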
More Reads

Happy Learning!!!

Day #16 - Python Basics

#Try from link https://try.jupyter.org/
#Ref Tutorial - https://www.youtube.com/watch?v=1I2Bz0qbMsc
#Example #1 - python variables
#boolean
a = True
#int
b = 4
#float
c = 3.45
#complex
e = 7j
print(c)
print(e)
print(a)
print(b)
#Example 2
name = "Siva"
print(name)
agelist=[1,2,3,4,5]
print(agelist)
agelist.append(10)
print(agelist)
dict_example={"name":"siva","credit":100}
print(dict_example["name"])
dict_example.values()
#Example 3 - Integer Array
import numpy as np
a = np.arange(1,10)
a
#Example 4 - Integer Array
import numpy as np
ds = np.arange(1.2,10,2,dtype=np.float64)
ds
#Example 5 shape of array
import numpy as np
ds = np.arange(1.2,10,2,dtype=np.float64)
ds.shape
ds
#Example 6 Parsing list
agelist=[1,2,3,4,5,6,7,8,9,10]
#print first element
print(agelist[0])
#print elements at indices 0 and 1 (the end index 2 is excluded)
print(agelist[0:2])
#print from 2 onwards
print(agelist[2:])
#print last element
print(agelist[-1])
#print last two elements
print(agelist[-2:])
#print start to last-1
print(agelist[0:-1])
#Example 7
a = 10
b = 3
#Float result
print(a/b)
#Int result
print(a//b)
#Example 8
#For loop
for counter in range(1,50,1):
    print(counter)
#Example 9
#While Loop
counter=1
while counter<100:
    print(counter)
    counter=counter+1
#Example 10
#If conditions
a=9
if a>10:
    print("a > 10")
elif a<10:
    print("a <10")
else:
    print("a=10")
#Example 11 Functions
def computesquare(n):
    return n*n
print(computesquare(10))
Happy Learning!!!

April 10, 2016

Probability Tips

  • Discrete random variables are things we count
  • A discrete variable is a variable which can only take a countable number of values
  • Probability mass function (pmf) is a function that gives the probability that a discrete random variable is exactly equal to some value.
  • Continuous random variables are things we measure
  • A continuous random variable is a random variable where the data can take infinitely many values.
  • Probability density function (PDF), or density of a continuous random variable, is a function that describes the relative likelihood for this random variable to take on a given value
  • A Bernoulli process is a finite or infinite sequence of binary random variables (see the sketch after these notes)
  • Markov Chain - a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event
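A minimal sketch contrasting a pmf with a pdf, plus a Bernoulli sequence, assuming scipy is available:
from scipy.stats import binom, norm, bernoulli

# pmf: P(X = 3) for a discrete X ~ Binomial(n=10, p=0.5)
print(binom.pmf(3, n=10, p=0.5))

# pdf: density of a continuous X ~ N(0, 1) at x = 0 (a relative likelihood, not a probability)
print(norm.pdf(0))

# Bernoulli process: a sequence of independent binary random variables
print(bernoulli.rvs(p=0.5, size=10))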
Let's Continue Learning!!!

Day #15 - Data Science - Maths Basics


Sets Basics
  • Cardinality - the number of distinct elements in a set (for a finite set)
  • For the real numbers, cardinality is infinite
Rational Numbers - made up of a ratio of two integers
The Fibonacci series was introduced in 1202 (Liber Abaci) - Amazing :)

Functions
  • Represents a relationship between mathematical variables
  • The set of all possible outputs is called the range
  • For a function that maps from A to B, A is referred to as the domain and B as the co-domain
Matrix
  • Rows and columns define a matrix
  • 2D array of numbers
  • Eigenvalues (scalars) and eigenvectors (vectors) are a special set of values associated with a matrix M
  • Eigenvectors - directions that remain unchanged (only scaled) by the action of matrix M
  • Trace - sum of the diagonal elements
  • Rank of a matrix - number of linearly independent rows (or columns)
Determinant
  • Can be computed only for a square matrix (see the sketch below)
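A minimal numpy sketch of the matrix quantities above; the matrix values are illustrative:
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(M)
print(eigenvalues)                 # scalars associated with M -> [2. 3.]
print(eigenvectors)                # directions only scaled by M
print(np.trace(M))                 # sum of diagonal elements -> 5.0
print(np.linalg.matrix_rank(M))    # number of linearly independent rows -> 2
print(np.linalg.det(M))            # determinant, defined for square matrices -> 6.0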
Vector
  • Vectors have magnitude (length) and direction
  • Magnitude and the cosine of the angle give the direction (direction cosines)
  • Vector (cross) product is non-commutative
  • Dot product is commutative (see the sketch below)
  • A set of vectors is linearly independent if none of them can be written as a linear combination of the others
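A minimal numpy sketch of the commutativity point:
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

print(np.dot(u, v) == np.dot(v, u))   # dot product is commutative -> True
print(np.cross(u, v))                 # [0. 0. 1.]
print(np.cross(v, u))                 # [0. 0. -1.] - cross product flips sign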



 Happy Learning!!!

April 09, 2016

Day #14 - R Working Tips

#Tip #1 - R - Case Sensitive
#Tip #2 - R - Data Stored in memory
#Tip #3 - Assigning variables
var2 <- 3
var2
var2 <- "Test"
var2
#Tip #4 - Working with Dates
#Convert character to date using as.Date function
as.Date('2016-04-16')
#Tip #5 - na.rm argument removes non-available (missing) values from a calculation
mean(c(1, 2, NA), na.rm = TRUE)
#Test for Missing Data
is.na(c(1, 2, NA))
#Tip #6
#Vectors - Basic Data Structures One Dimensional Arrays
#Created using c command
a_vector <- c(1,2,3,4,5)
a_vector
#Index starts with 1
a_vector[1]
b_vector <- c("Apple","Orange","Banana")
b_vector
b_vector[2]
#Negate - Ignore particular value, Return everything except 1
b_vector[-1]
#Tip #7 - Vector Operations
v1 <- c(2,3,4)
v2 <- c(10,11,12)
v1+v2
#Tip #8 - Lists
#Can hold different data types
emp_details_1 <- list(01,'Siva','India')
emp_details_1
emp_details_2 <- list(02,'TOM','USA')
emp_details_2
#Tip #9 - Data Frame (Store Rows and Columns)
#Row bind
df1 = rbind(emp_details_1,emp_details_2)
df1
#Column bind
df2 = cbind(v1,v2)
df2
#Tip #10 - Empty vector and append elements
a <- vector(mode="numeric", length=0)
b <- vector(mode="numeric", length=0)
a <- c(a,1)
b <- c(b,2)
a
b
#Tip #11 - Merge function - merge data frames based on a shared column (example below)
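#A minimal sketch (illustrative data frames): merge on a shared "id" column
df_a <- data.frame(id = c(1, 2), name = c("Siva", "Tom"))
df_b <- data.frame(id = c(1, 2), country = c("India", "USA"))
merge(df_a, df_b, by = "id")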
#Tip #12 - Factors
# R automatically recognizes
#as.factor converts into factor
#classification uses factors
gender <- c("Male","Female")
gender_cf <- as.factor(gender)
gender_cf
#Matrix Basics
xx <- matrix(c(0.4, 0.6, 0.3, 0.7))
xx
xx <- matrix(c(0.4, 0.6, 0.3, 0.7), nrow=2)
xx
xx <- matrix(c(0.4, 0.6, 0.3, 0.7), nrow=2)
xx <- t(xx)
xx
xx %*% xx
Happy Learning!!!

April 07, 2016

Day #13 - Maths and Data Science

  • Recommender Systems - Pure matrix decomposition problem
  • Deep Learning - Matrix Calculus
  • Google Search - PageRank; Social Media Graph Analysis - Eigen Decomposition (see the sketch below)
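A minimal sketch of the eigen-decomposition idea behind PageRank - power iteration on an illustrative column-stochastic link matrix (damping factor omitted for brevity):
import numpy as np

# Column-stochastic link matrix for 3 pages (each column sums to 1)
L = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

r = np.ones(3) / 3      # start with a uniform rank vector
for _ in range(50):     # power iteration converges to the dominant eigenvector
    r = L @ r
    r = r / r.sum()
print(r)                # the rank scores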
Happy Learning!!!

April 04, 2016

Ensemble



  • Combine many predictors and take a weighted average of their outputs
  • Use a single kind of learner, but many instances of it
  • A collection of "OK" predictors, combined, can become a powerful predictor
  • Learn predictors and combine them using another model (stacking)
  • One layer of predictors provides features for the next layer (see the sketch below)
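A minimal sketch of combining several "OK" predictors, assuming scikit-learn is available; a voting ensemble combines the individual models' predictions:
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Three different learners combined by majority vote
ensemble = VotingClassifier(estimators=[
    ('lr', LogisticRegression(max_iter=1000)),
    ('dt', DecisionTreeClassifier()),
    ('nb', GaussianNB()),
])
ensemble.fit(X, y)
print(ensemble.score(X, y))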
Happy Learning!!!