Examples

  1. Compute a distance between two graphs.

  2. Stochastic Gradient Descent applied to linear regression

Compute a distance between two graphs.

See Distance between two graphs.

<<<

from mlstatpy.graph import GraphDistance

# We define two graphs as lists of edges.
graph1 = [
    ("a", "b"),
    ("b", "c"),
    ("b", "X"),
    ("X", "c"),
    ("c", "d"),
    ("d", "e"),
    ("0", "b"),
]
graph2 = [
    ("a", "b"),
    ("b", "c"),
    ("b", "X"),
    ("X", "c"),
    ("c", "t"),
    ("t", "d"),
    ("d", "e"),
    ("d", "g"),
]

# We convert them into GraphDistance objects.
graph1 = GraphDistance(graph1)
graph2 = GraphDistance(graph2)

distance, graph = graph1.distance_matching_graphs_paths(graph2, use_min=False)

print("distance", distance)
print("common paths:", graph)

>>>

    distance 0.3318250377073907
    common paths: 0
    X
    a
    b
    c
    d
    e
    00
    11
    g
    t
    a -> b []
    b -> c []
    b -> X []
    X -> c []
    c -> d []
    d -> e []
    0 -> b []
    00 -> a []
    00 -> 0 []
    e -> 11 []
    c -> 2a.t []
    2a.t -> d []
    d -> 2a.g []
    2a.g -> 11 []

(original entry: graph_distance.py: docstring of mlstatpy.graph.graph_distance.GraphDistance, line 3)
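
As a rough sanity check, the sketch below (not part of mlstatpy) compares the two graphs with a plain Jaccard distance on their raw edge sets. It ignores paths and vertex matching, which distance_matching_graphs_paths takes into account, so the two numbers are not directly comparable; it only shows how much structure is shared edge by edge.

<<<

# Hypothetical baseline, independent of mlstatpy: Jaccard distance on the raw edge sets.
edges1 = {("a", "b"), ("b", "c"), ("b", "X"), ("X", "c"),
          ("c", "d"), ("d", "e"), ("0", "b")}
edges2 = {("a", "b"), ("b", "c"), ("b", "X"), ("X", "c"),
          ("c", "t"), ("t", "d"), ("d", "e"), ("d", "g")}

common = edges1 & edges2
union = edges1 | edges2
print("shared edges:", len(common))
print("Jaccard distance:", 1 - len(common) / len(union))

>>>

    shared edges: 5
    Jaccard distance: 0.5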


Stochastic Gradient Descent applied to linear regression

The following example shows how to optimize a simple linear regression.

<<<

import numpy
from mlstatpy.optim import SGDOptimizer


def fct_loss(c, X, y):
    return numpy.linalg.norm(X @ c - y) ** 2


def fct_grad(c, x, y, i=0):
    return x * (x @ c - y) * 0.1


coef = numpy.array([0.5, 0.6, -0.7])
X = numpy.random.randn(10, 3)
y = X @ coef

sgd = SGDOptimizer(numpy.random.randn(3))
sgd.train(X, y, fct_loss, fct_grad, max_iter=15, verbose=True)
print("optimized coefficients:", sgd.coef)

>>>

    0/15: loss: 13.1 lr=0.1 max(coef): 0.6 l1=0/1.5 l2=0/0.77
    1/15: loss: 3.101 lr=0.0302 max(coef): 0.6 l1=0.058/1.2 l2=0.0013/0.69
    2/15: loss: 0.5244 lr=0.0218 max(coef): 0.71 l1=0.088/1.5 l2=0.0028/0.8
    3/15: loss: 0.5874 lr=0.018 max(coef): 0.72 l1=0.016/1.5 l2=9.7e-05/0.87
    4/15: loss: 0.5043 lr=0.0156 max(coef): 0.71 l1=0.0098/1.6 l2=3.9e-05/0.89
    5/15: loss: 0.4271 lr=0.014 max(coef): 0.71 l1=0.03/1.6 l2=0.00041/0.9
    6/15: loss: 0.3292 lr=0.0128 max(coef): 0.7 l1=0.059/1.6 l2=0.0012/0.92
    7/15: loss: 0.2226 lr=0.0119 max(coef): 0.69 l1=0.0043/1.6 l2=7.5e-06/0.92
    8/15: loss: 0.1844 lr=0.0111 max(coef): 0.69 l1=0.003/1.6 l2=3.6e-06/0.94
    9/15: loss: 0.1643 lr=0.0105 max(coef): 0.68 l1=0.032/1.6 l2=0.00035/0.95
    10/15: loss: 0.1488 lr=0.00995 max(coef): 0.68 l1=0.028/1.6 l2=0.00027/0.95
    11/15: loss: 0.1346 lr=0.00949 max(coef): 0.67 l1=0.034/1.7 l2=0.00042/0.95
    12/15: loss: 0.121 lr=0.00909 max(coef): 0.67 l1=0.035/1.7 l2=0.00045/0.96
    13/15: loss: 0.1124 lr=0.00874 max(coef): 0.67 l1=0.0077/1.7 l2=2.3e-05/0.96
    14/15: loss: 0.1031 lr=0.00842 max(coef): 0.66 l1=0.066/1.7 l2=0.0022/0.97
    15/15: loss: 0.09233 lr=0.00814 max(coef): 0.66 l1=0.026/1.7 l2=0.00028/0.98
    optimized coefficients: [ 0.409  0.664 -0.609]

(original entry: sgd.py: docstring of mlstatpy.optim.sgd.SGDOptimizer, line 34)
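
Because the synthetic data are generated without noise, the coefficients found by SGD can be checked against the closed-form least-squares solution. The sketch below uses plain numpy and is not part of SGDOptimizer; on noise-free samples, numpy.linalg.lstsq recovers the true coefficients [0.5, 0.6, -0.7] up to numerical precision, whereas the SGD run above is still converging after 15 iterations.

<<<

import numpy

# Same noise-free setup as above; this reference solution is only a
# sanity check, not part of SGDOptimizer.
coef = numpy.array([0.5, 0.6, -0.7])
X = numpy.random.randn(10, 3)
y = X @ coef

# Closed-form least-squares solution to compare with sgd.coef.
closed_form, *_ = numpy.linalg.lstsq(X, y, rcond=None)
print("closed form:", closed_form)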
