I am trying to fit a TfidfVectorizer object on a list of video game reviews, but for some reason I am getting an error.
Here is my code:
from sklearn.feature_extraction.text import TfidfVectorizer
tfidf_vectorizer = TfidfVectorizer(max_features=50000, use_idf=True, ngram_range=(1, 3),
                                   preprocessor=data_preprocessor.preprocess_tokenized_review)
print(train_set_x[0])
%time tfidf_matrix = tfidf_vectorizer.fit_transform(train_set_x)
And here is the error message:
I haven't gotten around to playing the campaign but the multiplayer is solid and pretty fun. Includes Zero Dark Thirty pack, an Online Pass, and the all powerful Battlefield 4 Beta access.
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<timed exec> in <module>()
~/anaconda3/lib/python3.6/site-packages/sklearn/feature_extraction/text.py in fit_transform(self, raw_documents, y)
1379 Tf-idf-weighted document-term matrix.
1380 """
-> 1381 X = super(TfidfVectorizer, self).fit_transform(raw_documents)
1382 self._tfidf.fit(X)
1383 # X is already a transformed view of raw_documents so
~/anaconda3/lib/python3.6/site-packages/sklearn/feature_extraction/text.py in fit_transform(self, raw_documents, y)
867
868 vocabulary, X = self._count_vocab(raw_documents,
--> 869 self.fixed_vocabulary_)
870
871 if self.binary:
~/anaconda3/lib/python3.6/site-packages/sklearn/feature_extraction/text.py in _count_vocab(self, raw_documents, fixed_vocab)
790 for doc in raw_documents:
791 feature_counter = {}
--> 792 for feature in analyze(doc):
793 try:
794 feature_idx = vocabulary[feature]
~/anaconda3/lib/python3.6/site-packages/sklearn/feature_extraction/text.py in <lambda>(doc)
264
265 return lambda doc: self._word_ngrams(
--> 266 tokenize(preprocess(self.decode(doc))), stop_words)
267
268 else:
~/anaconda3/lib/python3.6/site-packages/sklearn/feature_extraction/text.py in <lambda>(doc)
239 return self.tokenizer
240 token_pattern = re.compile(self.token_pattern)
--> 241 return lambda doc: token_pattern.findall(doc)
242
243 def get_stop_words(self):
TypeError: expected string or bytes-like object
Note that the first part of the output is one of the reviews from my video game dataset. If anyone knows what is going on, I would appreciate the help. Thanks in advance!
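In case it helps to narrow things down, here is a minimal check I could run (assuming train_set_x is a list of raw review strings and data_preprocessor.preprocess_tokenized_review is my own helper). Since the traceback ends in token_pattern.findall(doc), which only accepts a string, the idea is to verify that both the raw documents and whatever the preprocessor returns are plain str objects rather than, say, lists of tokens or NaN values:

# List any documents that are not plain strings (e.g. NaN, None, token lists)
non_strings = [(i, type(doc)) for i, doc in enumerate(train_set_x)
               if not isinstance(doc, str)]
print(non_strings[:10])

# Check what my custom preprocessor returns for a single review;
# the vectorizer's built-in tokenizer expects a string back from it
print(type(data_preprocessor.preprocess_tokenized_review(train_set_x[0])))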