NapkinML Installation and Implementations: Introduction
NapkinML is a collection of pocket-sized implementations of machine learning models in NumPy.
Installation
$ git clone https://github.com/eriklindernoren/NapkinML
$ cd NapkinML
$ sudo python setup.py install
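After installation, the model classes should be importable directly from the napkin_ml package, which is how the bundled example scripts use them; the exact top-level exports are an assumption based on those scripts.

# Quick sanity check (a sketch; assumes napkin_ml exposes its model classes
# at the top level, the way the scripts under napkin_ml/examples/ import them).
from napkin_ml import KNN, LinearRegression, PCA
print(KNN, LinearRegression, PCA)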
Implementations
class KNN():
    def predict(self, k, Xt, X, y):
        y_pred = np.empty(len(Xt))
        for i, xt in enumerate(Xt):
            idx = np.argsort([np.linalg.norm(x-xt) for x in X])[:k]
            y_pred[i] = np.bincount([y[i] for i in idx]).argmax()
        return y_pred

$ python napkin_ml/examples/knn.py
Figure: Classification of the Iris dataset with k-nearest neighbors.
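As a rough illustration of how the class is meant to be called, here is a minimal sketch (not from the repo) that classifies points from two synthetic Gaussian clusters. It assumes KNN can be imported from the installed napkin_ml package, as the bundled example scripts do.

# Minimal KNN usage sketch on synthetic data (an assumption, not the repo's example).
import numpy as np
from napkin_ml import KNN

rng = np.random.RandomState(0)
# Two clusters of 2-D points with labels 0 and 1
X = np.vstack([rng.randn(50, 2), rng.randn(50, 2) + 3])
y = np.array([0] * 50 + [1] * 50)

# Classify two query points by majority vote among their 5 nearest neighbors
Xt = np.array([[0.0, 0.0], [3.0, 3.0]])
print(KNN().predict(5, Xt, X, y))   # expected: [0. 1.]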
class LinearRegression():
    def fit(self, X, y):
        self.w = np.linalg.lstsq(X, y, rcond=None)[0]
    def predict(self, X):
        return X.dot(self.w)

$ python napkin_ml/examples/linear_regression.py
Figure: Linear regression.
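A minimal sketch of fitting a noisy line with the class above (synthetic data, not the repo's example). Since fit() simply solves a least-squares problem on X, a bias column of ones is added by hand to learn an intercept.

# Minimal LinearRegression usage sketch; assumes the class is importable from napkin_ml.
import numpy as np
from napkin_ml import LinearRegression

rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, size=200)
y = 2.0 * x + 1.0 + 0.3 * rng.randn(200)      # noisy samples of y = 2x + 1

X = np.column_stack([np.ones_like(x), x])     # [1, x]: intercept + slope
model = LinearRegression()
model.fit(X, y)
print(model.w)              # roughly [1.0, 2.0] (intercept, slope)
print(model.predict(X[:3]))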
class LDA():
    def fit(self, X, y):
        cov_sum = sum([np.cov(X[y == val], rowvar=False) for val in [0, 1]])
        mean_diff = X[y == 0].mean(0) - X[y == 1].mean(0)
        self.w = np.linalg.inv(cov_sum).dot(mean_diff)
    def predict(self, X):
        return 1 * (X.dot(self.w) < 0)

$ python napkin_ml/examples/lda.py
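A minimal sketch (not from the repo) of the LDA class on two synthetic Gaussian classes. The class means are placed symmetrically about the origin because predict() thresholds the projection at zero; LDA is again assumed to be importable from the installed napkin_ml package.

# Minimal LDA usage sketch on two synthetic classes (an assumption, not the repo's example).
import numpy as np
from napkin_ml import LDA

rng = np.random.RandomState(0)
# Class means at (-2, -2) and (+2, +2), symmetric about the origin,
# so the zero threshold in predict() is a sensible decision boundary here.
X = np.vstack([rng.randn(100, 2) - 2, rng.randn(100, 2) + 2])
y = np.array([0] * 100 + [1] * 100)

lda = LDA()
lda.fit(X, y)
pred = lda.predict(X)
print("accuracy:", (pred == y).mean())   # close to 1.0 for well-separated classes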
class LogisticRegression():
    def fit(self, X, y, n_iter=4000, lr=0.01):
        self.w = np.random.rand(X.shape[1])
        for _ in range(n_iter):
            self.w -= lr * (self.predict(X) - y).dot(X)
    def predict(self, X):
        return sigmoid(X.dot(self.w))

$ python napkin_ml/examples/logistic_regression.py
Figure: Logistic regression classification.
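A minimal sketch (not from the repo) of the class above on two synthetic blobs. A bias column is prepended by hand because the model otherwise learns a decision boundary through the origin, and the sigmoid helper used by predict() is assumed to ship with the napkin_ml package.

# Minimal LogisticRegression usage sketch on synthetic data (an assumption).
import numpy as np
from napkin_ml import LogisticRegression

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2) - 2, rng.randn(100, 2) + 2])
X = np.column_stack([np.ones(len(X)), X])        # prepend a bias term
y = np.array([0] * 100 + [1] * 100)

clf = LogisticRegression()
clf.fit(X, y)                                    # defaults: 4000 iterations, lr=0.01
pred = (clf.predict(X) > 0.5).astype(int)        # predict() returns probabilities
print("accuracy:", (pred == y).mean())           # close to 1.0 for this data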
class MLP():
    def fit(self, X, y, n_epochs=4000, lr=0.01, n_units=10):
        self.w = np.random.rand(X.shape[1], n_units)
        self.v = np.random.rand(n_units, y.shape[1])
        for _ in range(n_epochs):
            h_out = sigmoid(X.dot(self.w))
            out = softmax(h_out.dot(self.v))
            self.v -= lr * h_out.T.dot(out - y)
            self.w -= lr * X.T.dot((out - y).dot(self.v.T) * (h_out * (1 - h_out)))
    def predict(self, X):
        return softmax(sigmoid(X.dot(self.w)).dot(self.v))

$ python napkin_ml/examples/mlp.py
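A minimal sketch (not from the repo) of training the MLP on three synthetic blobs. Targets are one-hot encoded because fit() reads the number of classes from y.shape[1], and the blobs are placed in different directions from the origin since the network has no bias terms. The sigmoid and softmax helpers are assumed to come with the napkin_ml package.

# Minimal MLP usage sketch: 3-class classification on synthetic blobs (an assumption).
import numpy as np
from napkin_ml import MLP

rng = np.random.RandomState(0)
# Three Gaussian blobs in 2-D, 50 points each, in different directions from the origin
means = np.array([[0.0, 4.0], [4.0, -2.0], [-4.0, -2.0]])
X = np.vstack([rng.randn(50, 2) + m for m in means])
labels = np.repeat([0, 1, 2], 50)
Y = np.eye(3)[labels]            # one-hot targets, as fit() expects

mlp = MLP()
mlp.fit(X, Y)                    # defaults: 4000 epochs, lr=0.01, 10 hidden units
pred = mlp.predict(X).argmax(axis=1)
print("accuracy:", (pred == labels).mean())   # should be high for these separated blobs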
class PCA():
    def transform(self, X, dim):
        _, S, V = np.linalg.svd(X - X.mean(0), full_matrices=True)
        idx = S.argsort()[::-1]
        V = V[idx][:dim]
        return X.dot(V.T)

$ python napkin_ml/examples/pca.py
Figure: Dimensionality reduction with principal component analysis.
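A minimal sketch (not from the repo): projecting correlated 3-D data onto its first two principal components with the class above, again assuming PCA is importable from the installed napkin_ml package.

# Minimal PCA usage sketch on synthetic low-rank data (an assumption).
import numpy as np
from napkin_ml import PCA

rng = np.random.RandomState(0)
latent = rng.randn(200, 2)
# 3-D observations that lie close to a 2-D plane, plus a little noise
X = latent.dot(rng.randn(2, 3)) + 0.05 * rng.randn(200, 3)

Z = PCA().transform(X, dim=2)
print(Z.shape)                    # (200, 2)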
NapkinML Installation and Implementations: Official Site
https://github.com/eriklindernoren/NapkinML