This approach assumes that the sample points lie on a smooth low-dimensional manifold embedded in the high-dimensional data space, and it seeks a low-dimensional representation that optimally preserves local nearest-neighbor relationships. The solution obtained in this way reflects the geometric structure of the manifold.
Step one: Construct a graph G = (V, E), where V = {vi, i = 1, 2, ..., n} is the set of vertices and E = {eij} is the set of edges connecting vertices vi and vj. Each vertex vi of the graph corresponds to a point xi in the sample set X. We connect vi and vj if xi and xj are close to each other; that is, we insert an edge eij between the two vertices if xj is among the k nearest neighbors of xi, where k is a user-defined parameter.
Step two: Assign each edge a weight Wij, with weight 0 between points that are not connected:

Wij = exp(-||xi - xj||^2 / t), if vi and vj are connected by an edge; Wij = 0, otherwise,

where t is the heat-kernel parameter.
Step three: Let D be the diagonal degree matrix with Dii = Σj Wij, and let L = D - W (the graph Laplacian). Perform the generalized eigenvalue decomposition:

L f = λ D f
Take the eigenvectors corresponding to the m + 1 smallest eigenvalues, discard the eigenvector associated with λ = 0, and select the remaining m eigenvectors as the dimension-reduced coordinates (a compact sketch of all three steps follows below).
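The three steps above can also be expressed compactly with scipy and scikit-learn helpers. The sketch below is illustrative only and is not the implementation from this post; kneighbors_graph and eigh are the assumed scikit-learn/scipy helpers, and laplacian_eigenmaps is a name chosen here for illustration:

# Minimal sketch of Laplacian Eigenmaps, assuming scipy and scikit-learn.
# Not the implementation from this post; shown only to mirror steps one to three.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def laplacian_eigenmaps(X, m=2, k=11, t=5.0):
    # Step one: k-nearest-neighbour graph; entries hold Euclidean distances
    A = kneighbors_graph(X, k, mode='distance').toarray()
    A = np.maximum(A, A.T)  # symmetrize so the graph is undirected
    # Step two: heat-kernel weights, zero where there is no edge
    W = np.where(A > 0, np.exp(-A ** 2 / t), 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W
    # Step three: generalized eigenvalue problem L f = lambda D f
    lamda, F = eigh(L, D)  # eigenvalues returned in ascending order
    # Drop the trivial eigenvector for lambda = 0, keep the next m
    return F[:, 1:m + 1]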
1. Python implementation of Laplacian Eigenmaps dimensionality reduction
from numpy import *
from numpy import linalg
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the 3d projection

def laplaEigen(dataMat, k, t):
    m, n = shape(dataMat)
    W = mat(zeros([m, m]))
    D = mat(zeros([m, m]))
    for i in range(m):
        k_index = knn(dataMat[i, :], dataMat, k)
        for j in range(k):
            sqDiffVector = dataMat[i, :] - dataMat[k_index[j], :]
            sqDiffVector = array(sqDiffVector) ** 2
            sqDistances = sqDiffVector.sum()
            # Heat-kernel weight between xi and its j-th nearest neighbour
            W[i, k_index[j]] = exp(-sqDistances / t)
            D[i, i] += W[i, k_index[j]]
    L = D - W
    # Solve the generalized problem L f = lambda D f via eig(D^-1 L)
    Dinv = linalg.inv(D)
    X = dot(Dinv, L)
    lamda, f = linalg.eig(X)
    return lamda, f

def knn(inX, dataSet, k):
    # Return the indices of the k nearest neighbours of inX in dataSet
    dataSetSize = dataSet.shape[0]
    diffMat = tile(inX, (dataSetSize, 1)) - dataSet
    sqDiffMat = array(diffMat) ** 2
    sqDistances = sqDiffMat.sum(axis=1)
    distances = sqDistances ** 0.5
    sortedDistIndicies = distances.argsort()
    return sortedDistIndicies[0:k]

# make_swiss_roll is defined in section 2 below
dataMat, color = make_swiss_roll(n_samples=2000)
lamda, f = laplaEigen(dataMat, 11, 5.0)
fm, fn = shape(f)
print('fm,fn:', fm, fn)
lamdaIndicies = argsort(lamda)
first = 0
second = 0
print(lamdaIndicies[0], lamdaIndicies[1])
for i in range(fm):
    # Skip the (near-)zero eigenvalue, then take the next two eigenvectors
    if lamda[lamdaIndicies[i]].real > 1e-5:
        print(lamda[lamdaIndicies[i]])
        first = lamdaIndicies[i]
        second = lamdaIndicies[i + 1]
        break
print(first, second)
redEigVects = f[:, lamdaIndicies]
fig = plt.figure('origin')
ax1 = fig.add_subplot(111, projection='3d')
ax1.scatter(dataMat[:, 0], dataMat[:, 1], dataMat[:, 2], c=color, cmap=plt.cm.Spectral)
fig = plt.figure('lowdata')
ax2 = fig.add_subplot(111)
ax2.scatter(f[:, first], f[:, second], c=color, cmap=plt.cm.Spectral)
plt.show()
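As a cross-check, scikit-learn ships a built-in SpectralEmbedding estimator that implements Laplacian Eigenmaps. The snippet below is an illustrative sketch, not part of the original post; signs and scaling of the resulting coordinates may differ from the code above:

# Illustrative cross-check with scikit-learn's built-in Laplacian Eigenmaps.
from sklearn.manifold import SpectralEmbedding

se = SpectralEmbedding(n_components=2, affinity='rbf', gamma=1.0 / 5.0)  # gamma = 1/t
lowdata = se.fit_transform(dataMat)  # dataMat from the script above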
2. Laplacian Eigenmaps dimensionality reduction experiment
The experimental data are generated with the following function and parameters:
def make_swiss_roll(n_samples=100, noise=0.0, random_state=None):
    # Generate a swiss roll dataset.
    generator = random.RandomState(random_state)
    t = 1.5 * pi * (1 + 2 * generator.rand(1, n_samples))
    x = t * cos(t)
    y = 83 * generator.rand(1, n_samples)
    z = t * sin(t)
    X = concatenate((x, y, z))
    X += noise * generator.randn(3, n_samples)
    X = X.T
    t = squeeze(t)
    return X, t
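A quick way to confirm the generator's output shapes (illustrative only):

# Sanity check: X should be (n_samples, 3); t, the roll parameter, (n_samples,)
X, t = make_swiss_roll(n_samples=2000)
print(X.shape, t.shape)  # expected: (2000, 3) (2000,)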
The results of the experiment are shown below: the 'origin' figure plots the original three-dimensional swiss roll, and the 'lowdata' figure plots its two-dimensional embedding.
The above python implementation of Laplacian Eigenmaps (Laplace feature map) dimensionality reduction is all that I have to share with you.