Text preprocessing¶
Free text is never simple. Every environment has its own language: nobody writes on a social network the way one writes in a dictionary. How do we convert that into numbers? That is the question.
Bag of Words¶
It all starts here. The first step is to split a text into tokens (characters, words, …), most often words. Each word receives an identifier, and a sentence becomes a list of integers.
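A minimal sketch of that mapping, assuming a naive whitespace tokenizer (real tokenizers also handle punctuation, casing, and subwords):

[ ]:
# Naive whitespace tokenization: each new word gets the next free identifier.
text = "le machine learning et le deep learning"
vocab = {}
ids = []
for word in text.split():
    if word not in vocab:
        vocab[word] = len(vocab)
    ids.append(vocab[word])
print(vocab)  # {'le': 0, 'machine': 1, 'learning': 2, 'et': 3, 'deep': 4}
print(ids)    # [0, 1, 2, 3, 0, 4, 2]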
The simplest approach is then to assign each word a binary vector $v \in \{0, 1\}^N$, where $N$ is the number of words known to the model: for the word with identifier $i$, component $v_j$ is 1 if $j = i$ and 0 otherwise. Each vector contains a single 1 (a one-hot encoding).
Summing these one-hot vectors then yields a single vector per sentence, whatever the length of the sentence: CountVectorizer.
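By hand, with numpy and the `vocab` and `ids` from the sketch above, the summation looks like this:

[ ]:
import numpy as np

# One one-hot row per token: a single 1 at the word's identifier.
one_hot = np.eye(len(vocab), dtype=int)[ids]
# Summing over the tokens gives one count vector for the whole sentence.
counts = one_hot.sum(axis=0)
print(counts)  # [2 1 2 1 1]: 'le' and 'learning' appear twice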
[ ]:
from sklearn.feature_extraction.text import CountVectorizer
corpus = [
"J'adore le machine learning et l'intelligence artificielle.",
"Les réseaux de neurones sont puissants pour la vision par ordinateur.",
"Scikit-learn est une super bibliothèque pour le ML.",
"Transformer est un modèle de deep learning très utilisé.",
]
vectorizer = CountVectorizer(max_features=10)
X = vectorizer.fit_transform(corpus)
print("Shape de la matrice TF-IDF :", X.shape)
Shape of the count matrix: (4, 10)
[10]:
X.todense()
[10]:
matrix([[1, 0, 0, 0, 0, 1, 1, 1, 1, 0],
[0, 0, 1, 0, 0, 0, 0, 0, 0, 1],
[0, 1, 0, 0, 1, 0, 0, 1, 0, 1],
[0, 0, 1, 1, 1, 0, 0, 0, 1, 0]])
0 or 1: can we do better? That is the goal of the TF-IDF approach: give each word a weight that depends on its frequency within the document and on its frequency across the whole collection of documents. The aim is to play down words that are too rare to be significant, as well as words that are too frequent, such as stop words, so common that removing them takes nothing away from the meaning of the sentence: TfidfVectorizer.
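Concretely, with $n$ documents and $\mathrm{df}(t)$ the number of documents containing the term $t$, the smoothed weighting scikit-learn applies by default is

$$\mathrm{tfidf}(t, d) = \mathrm{tf}(t, d) \times \left( \ln \frac{1 + n}{1 + \mathrm{df}(t)} + 1 \right),$$

and each document vector is then normalized to unit L2 norm, which is why every row of the matrix below has norm 1.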
[14]:
from sklearn.feature_extraction.text import TfidfVectorizer
corpus = [
"J'adore le machine learning et l'intelligence artificielle.",
"Les réseaux de neurones sont puissants pour la vision par ordinateur.",
"Scikit-learn est une super bibliothèque pour le ML.",
"Transformer est un modèle de deep learning très utilisé.",
]
vectorizer = TfidfVectorizer(max_features=10)
X = vectorizer.fit_transform(corpus)
print("Shape de la matrice TF-IDF :", X.shape)
X.todense()
Shape of the TF-IDF matrix: (4, 10)
[14]:
matrix([[0.48546061, 0. , 0. , 0. , 0. ,
0.48546061, 0.48546061, 0.38274272, 0.38274272, 0. ],
[0. , 0. , 0.70710678, 0. , 0. ,
0. , 0. , 0. , 0. , 0.70710678],
[0. , 0.59081908, 0. , 0. , 0.46580855,
0. , 0. , 0.46580855, 0. , 0.46580855],
[0. , 0. , 0.46580855, 0.59081908, 0.46580855,
0. , 0. , 0. , 0.46580855, 0. ]])
[15]:
vectorizer.get_feature_names_out()
[15]:
array(['artificielle', 'bibliothèque', 'de', 'deep', 'est', 'et',
'intelligence', 'le', 'learning', 'pour'], dtype=object)
n-grams¶
Neither of these approaches gives any weight to the order of the words: shuffling them changes nothing. To bring order back, we use n-grams, which amounts to considering single words, pairs of words, triplets of words, and so on.
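Extracting the n-grams themselves is straightforward; here is a plain-Python sketch for word n-grams (the `ngrams` helper is illustrative, not part of scikit-learn):

[ ]:
def ngrams(tokens, n):
    """Return all n-grams of a token list, joined as strings."""
    return [" ".join(tokens[i : i + n]) for i in range(len(tokens) - n + 1)]

print(ngrams("le machine learning est utile".split(), 2))
# ['le machine', 'machine learning', 'learning est', 'est utile']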
[16]:
from sklearn.feature_extraction.text import TfidfVectorizer
corpus = [
"J'adore le machine learning et l'intelligence artificielle.",
"Les réseaux de neurones sont puissants pour la vision par ordinateur.",
"Scikit-learn est une super bibliothèque pour le ML.",
"Transformer est un modèle de deep learning très utilisé.",
]
vectorizer = TfidfVectorizer(max_features=10, ngram_range=(1, 2))
X = vectorizer.fit_transform(corpus)
print("Shape de la matrice TF-IDF :", X.shape)
X.todense()
Shape of the TF-IDF matrix: (4, 10)
[16]:
matrix([[0.66767854, 0. , 0. , 0. , 0. ,
0. , 0. , 0.52640543, 0.52640543, 0. ],
[0. , 0. , 0. , 0.52640543, 0.66767854,
0. , 0. , 0. , 0. , 0.52640543],
[0. , 0.50867187, 0.50867187, 0. , 0. ,
0. , 0.40104275, 0.40104275, 0. , 0.40104275],
[0. , 0. , 0. , 0.46580855, 0. ,
0.59081908, 0.46580855, 0. , 0.46580855, 0. ]])
[17]:
vectorizer.get_feature_names_out()
[17]:
array(['artificielle', 'bibliothèque', 'bibliothèque pour', 'de',
'de neurones', 'deep', 'est', 'le', 'learning', 'pour'],
dtype=object)
Deep Learning¶
In the end, the point is to compress sentences into a numerical vector space. The more text is available, the more effective the compressions that can be learned. Deep learning and computing power come to the rescue. A popular approach is word2vec. Another package, textblob, enriches sentences by tagging words (noun, verb, …). There are also spacy and NLTK.
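As a sketch, word2vec embeddings can be trained with the gensim package (assumed installed here; our four-sentence corpus is far too small for the vectors to be meaningful, a real use case needs millions of sentences):

[ ]:
# Minimal word2vec sketch with gensim (assumed installed); toy corpus only.
from gensim.models import Word2Vec

tokenized = [doc.lower().split() for doc in corpus]
w2v = Word2Vec(sentences=tokenized, vector_size=32, window=3, min_count=1)
print(w2v.wv["learning"].shape)  # one dense 32-dimensional vector per word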
The most effective route is probably to use a deep learning model already trained on a task close to the prediction problem to be solved.
[20]:
import torch
import numpy as np
from transformers import AutoTokenizer, AutoModel
# Load the tokenizer and the model
MODEL_NAME = "google/bert_uncased_L-2_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
[21]:
def get_embedding(text):
"""Convertir un texte en embedding."""
tokens = tokenizer(text, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
output = model(**tokens)
return output.last_hidden_state[:, 0, :].numpy()  # take the [CLS] token embedding
# Apply to our corpus
X_bert = np.vstack([get_embedding(t) for t in corpus])
print("Shape des embeddings BERT :", X_bert.shape) # (5, 768)
Asking to truncate to max_length but no maximum length is provided and the model has no predefined maximum length. Default to no truncation.
Shape of the BERT embeddings: (4, 128)
[22]:
X_bert
[22]:
array([[-1.01954079e+00, 1.09865868e+00, -3.24030900e+00,
-1.63258791e+00, -1.05619824e+00, -1.61958218e-01,
5.81856966e-01, 1.30005205e+00, -8.13747764e-01,
-1.70176730e-01, 9.68299985e-01, -1.26105383e-01,
1.74509242e-01, -1.15167648e-01, 1.64889967e+00,
-5.84569156e-01, 7.68755794e-01, 1.78837568e-01,
-1.32599080e+00, 1.69023693e-01, -1.79380298e-01,
-5.71163356e-01, 3.29246068e+00, 6.05863452e-01,
1.45341444e+00, -1.82031751e-01, 3.90212893e-01,
6.18985474e-01, 6.09537959e-01, -1.06283987e+00,
-3.49633962e-01, -4.75799978e-01, -2.02315283e+00,
2.34588310e-01, 3.09544921e-01, -1.70815694e+00,
-4.13940072e-01, 1.76980734e-01, -3.81380868e+00,
-5.86692035e-01, 3.51953477e-01, 3.45019139e-02,
1.19410813e+00, -1.84602499e+00, -1.17189325e-01,
-1.89737916e+00, -1.35886836e+00, -1.02250898e+00,
1.80050611e-01, -4.42315340e-01, -1.75411665e+00,
2.22501710e-01, -4.22832400e-01, -4.16091889e-01,
5.51143229e-01, -9.70044315e-01, 1.22573209e+00,
8.65920365e-01, -6.82873189e-01, 1.71250188e+00,
7.08095312e-01, -4.03583437e-01, -3.61105740e-01,
-7.96923161e-01, -7.38765001e-01, 9.53389943e-01,
-9.91858184e-01, -5.60370147e-01, -2.69480467e-01,
-1.69493437e+00, 1.14979811e-01, 3.37393843e-02,
6.00349307e-01, 3.67366642e-01, 2.14658469e-01,
-3.17729563e-01, 6.11521780e-01, 1.06701374e+00,
2.32848287e+00, -1.33116812e-01, 9.25856054e-01,
-4.24337327e-01, 4.42405224e-01, 5.44987321e-01,
-1.10938144e+00, -4.47214723e-01, 1.32186219e-01,
1.31740761e+00, 1.88220763e+00, -2.60719705e+00,
1.83671737e+00, 7.52062678e-01, -1.68656397e+00,
1.63074791e+00, -2.21512839e-01, 2.07889438e+00,
-7.53570378e-01, -1.04282379e-01, -5.03074288e-01,
1.27921963e+00, -9.84446704e-01, 1.13481379e+00,
5.14088631e-01, -4.60754372e-02, 4.31375206e-01,
-1.44148731e+00, 3.97931218e-01, 6.03498995e-01,
1.02442682e+00, -1.19910133e+00, 7.62686551e-01,
7.26533830e-01, -8.08191001e-01, 5.69890499e-01,
-3.59242469e-01, 1.45955420e+00, -1.04817414e+00,
-9.74340081e-01, 4.09182042e-01, 1.28081393e+00,
4.36158776e-02, 4.89389658e-01, 9.87771630e-01,
-1.12217844e+00, -6.38612211e-02, -1.61485445e+00,
-1.66994941e+00, -7.72857726e-01],
[-6.83127999e-01, -5.56167126e-01, -3.63867450e+00,
-1.28199339e+00, -1.20123577e+00, 1.96135312e-01,
-4.88897085e-01, 9.48321402e-01, -2.59338289e-01,
-3.53679687e-01, 1.32931364e+00, -8.70594501e-01,
-5.59956849e-01, 4.18603607e-03, 2.12153935e+00,
1.01919997e+00, -2.05129668e-01, -4.72437859e-01,
-1.47526550e+00, -3.87323767e-01, 2.03684513e-02,
-9.44388568e-01, 2.24987054e+00, 2.25593001e-01,
9.02955890e-01, -1.99352992e+00, 1.35320649e-01,
3.70547414e-01, 4.04503971e-01, -5.46593189e-01,
1.19153671e-01, 1.05680227e-01, -1.11140954e+00,
2.82122195e-01, 4.32250321e-01, -1.94234884e+00,
-5.09086430e-01, 1.95005760e-02, -3.39564824e+00,
-1.70796132e+00, 9.99035895e-01, -6.17129385e-01,
7.32943058e-01, -1.73209143e+00, 5.00957131e-01,
-5.24365485e-01, -2.96006262e-01, -3.38728994e-01,
-1.52206913e-01, -9.00493562e-01, -1.55217767e+00,
2.38435769e+00, 3.12831938e-01, 2.22100094e-01,
1.25556505e+00, -8.43219757e-01, 6.92592323e-01,
1.26021349e+00, -4.11273986e-01, 7.33420730e-01,
2.10238791e+00, -6.01617038e-01, -1.96537346e-01,
-3.18308622e-01, -6.32068634e-01, 1.57762304e-01,
-5.37661433e-01, 4.19160098e-01, 1.26220107e+00,
-1.15943038e+00, -1.20003676e+00, -1.41549408e-01,
5.64820409e-01, -1.82437614e-01, -9.09673512e-01,
-1.24141254e-01, 1.61712325e+00, 1.06940150e+00,
1.89118302e+00, -5.24994254e-01, 9.35502708e-01,
8.79120827e-01, 1.02002311e+00, 7.60220468e-01,
-7.93807507e-01, 4.03508134e-02, -2.12649837e-01,
8.24379146e-01, 2.07996225e+00, -1.55340850e+00,
3.97868395e-01, -1.95578784e-01, -1.02845728e+00,
1.40823638e+00, -5.12194753e-01, 3.42124462e-01,
-4.75329906e-01, -2.32395753e-02, 2.39902452e-01,
1.66990519e+00, -1.61591375e+00, 2.46869326e-01,
7.44110823e-01, -4.68034178e-01, 2.53083080e-01,
-2.05041742e+00, -6.50393724e-01, 3.14676970e-01,
9.03494239e-01, -1.20414937e+00, 5.68512917e-01,
1.18558300e+00, 1.06637108e+00, 5.49683034e-01,
-6.36744082e-01, 9.34837282e-01, -1.72392225e+00,
-1.48603058e+00, 2.78058518e-02, 1.90578401e-01,
-3.07050020e-01, 1.32692322e-01, 2.69899511e+00,
-1.28229845e+00, -8.16217601e-01, -9.63375986e-01,
-1.29919481e+00, 1.30306625e+00],
[-8.12666893e-01, 6.52709305e-02, -3.07674003e+00,
-1.73006678e+00, 5.56517914e-02, -2.04863325e-02,
1.05746277e-01, 1.20778036e+00, -3.19906652e-01,
5.40784001e-01, 1.56712031e+00, 3.82446766e-01,
9.02545676e-02, -2.24934638e-01, 2.30759454e+00,
-2.45177016e-01, -8.60414624e-01, -1.13826700e-01,
-1.02362096e+00, -6.76768646e-02, 3.67623121e-02,
-1.40830851e+00, 2.92808414e+00, 1.04080420e-02,
1.06485045e+00, -9.46118176e-01, 8.19238722e-01,
1.21418603e-01, 1.02423477e+00, -6.36593938e-01,
5.21099150e-01, -1.33833904e-02, -1.45841122e+00,
2.64952987e-01, 9.21794057e-01, -9.53970313e-01,
-1.35562587e+00, -1.16663344e-01, -3.58844757e+00,
-1.64637232e+00, 1.40329659e-01, -6.69219315e-01,
1.44580746e+00, -1.06386876e+00, 3.14496487e-01,
-6.42086446e-01, -2.83296406e-01, -5.64796329e-01,
-5.01917839e-01, -1.27303565e+00, -1.45242953e+00,
8.28907609e-01, -2.32205883e-01, -7.54631162e-01,
1.02804494e+00, -7.21767187e-01, 1.24999726e+00,
1.18172979e+00, -2.79485136e-01, 1.13859248e+00,
1.60498476e+00, -4.04708713e-01, -2.41502494e-01,
-4.90335375e-01, -6.52110800e-02, 4.24102694e-01,
-1.09363198e+00, -8.64771158e-02, -1.83021978e-01,
-1.91083729e+00, -9.95639563e-01, -1.69918492e-01,
-7.35301673e-01, 1.50415108e-01, -8.40810597e-01,
1.99823573e-01, 1.45653582e+00, 1.92442417e+00,
1.98283410e+00, -3.56586248e-01, 1.22377563e+00,
2.72677988e-01, 7.04353154e-01, 6.59684837e-01,
-1.23798490e+00, -7.79708922e-01, -7.47595191e-01,
4.59805787e-01, 1.99949372e+00, -1.52481484e+00,
7.26100147e-01, 5.82079999e-02, -1.10623360e+00,
1.40464211e+00, -4.88754869e-01, 1.36618829e+00,
-5.92888713e-01, 3.41841877e-01, -3.47859114e-01,
1.97709596e+00, -1.99714160e+00, 1.37554669e+00,
2.04651952e-01, -1.66606218e-01, 2.18841210e-01,
-1.91836238e+00, 4.76566344e-01, 9.21216786e-01,
6.81123912e-01, -1.39042044e+00, 8.64243329e-01,
2.71000504e-01, 4.00340766e-01, 1.04092801e+00,
-5.15078187e-01, 9.91614044e-01, -1.30341268e+00,
-1.20916986e+00, 7.31747568e-01, 1.17832410e+00,
-8.35951507e-01, 1.13972016e-01, 6.74843788e-01,
-1.64233255e+00, -1.06314278e+00, -1.60340345e+00,
-1.14451480e+00, 7.63953149e-01],
[-1.12000859e+00, 1.11265957e+00, -3.82756567e+00,
-1.48491859e+00, 1.63687449e-02, 4.55627501e-01,
-5.65744452e-02, 1.27985466e+00, -7.67126024e-01,
-2.81498551e-01, 8.97506416e-01, -1.42197534e-02,
-8.19370300e-02, -2.26925492e-01, 1.66618085e+00,
-6.61452532e-01, -3.70589644e-01, -1.09999168e+00,
-1.39308095e+00, 8.07408360e-04, -2.12185517e-01,
-9.45967555e-01, 2.69539142e+00, 5.93376160e-01,
2.22086072e+00, -4.85737562e-01, 1.40290707e-01,
-1.10392824e-01, 1.46630216e+00, -3.17616343e-01,
1.32928157e+00, -1.87262982e-01, -1.42463541e+00,
3.55866700e-01, 2.60791987e-01, -1.61093724e+00,
-1.45970356e+00, 6.16086684e-02, -3.01096487e+00,
-8.79962981e-01, 6.04537249e-01, -1.22271383e+00,
1.07631767e+00, -1.58503139e+00, 4.00953680e-01,
-1.12085891e+00, -6.88587129e-02, -5.49067140e-01,
3.99110794e-01, -4.31001872e-01, -1.07458723e+00,
1.11806774e+00, -4.03885245e-01, -3.39854173e-02,
1.46387565e+00, -1.34119225e+00, 5.96230686e-01,
7.74689734e-01, -1.07978559e+00, 2.73811936e+00,
7.24591374e-01, 7.71366835e-01, -1.01817943e-01,
-1.26372945e+00, -9.41565573e-01, 5.54629862e-01,
-6.51445925e-01, 4.10677493e-01, 9.97073576e-02,
-9.63636816e-01, -6.58609033e-01, -4.14109051e-01,
8.77419531e-01, 8.53793398e-02, -1.36762953e+00,
4.69778121e-01, 1.13651264e+00, 7.64670253e-01,
2.27115512e+00, -8.53716016e-01, -7.40047917e-02,
6.82033718e-01, 5.49325943e-01, 6.04129076e-01,
-3.34445417e-01, -8.17370117e-01, 1.93748042e-01,
1.08799136e+00, 2.31607056e+00, -2.03518558e+00,
9.13879573e-01, 3.04467697e-02, -7.94529736e-01,
1.77246606e+00, -5.89831591e-01, 1.34107971e+00,
-9.85443532e-01, 1.08929262e-01, -3.34062755e-01,
1.95911872e+00, -1.28316021e+00, -2.35506403e-03,
2.11453456e-02, 7.92738080e-01, -4.20279115e-01,
-1.61208737e+00, -2.07351923e-01, 3.64274949e-01,
1.55637875e-01, -1.22222984e+00, 3.05164438e-02,
9.23316658e-01, -1.17150033e+00, 2.75564492e-01,
-8.36010635e-01, 1.71956778e+00, -9.15843844e-01,
-7.27239430e-01, 1.12725636e-02, 9.40105245e-02,
-5.55701375e-01, 4.86811817e-01, 1.87932193e+00,
-1.20891237e+00, -6.51138246e-01, -1.93047583e+00,
-1.27677214e+00, 4.32397842e-01]], dtype=float32)
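These embeddings can feed any downstream classifier, or be compared directly; for instance, pairwise cosine similarity between the four sentences:

[ ]:
from sklearn.metrics.pairwise import cosine_similarity

# Similarity matrix between the four sentence embeddings (shape (4, 4)).
sim = cosine_similarity(X_bert)
print(sim.round(2))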