ChatGPT Cookbook 7: Text Comparison and Utilization

Using embeddings for search is demonstrated in Semantic_text_search_using_embeddings.ipynb.
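The core of embedding-based search is ranking documents by cosine similarity to the query's embedding. A minimal sketch with NumPy, using toy vectors in place of real API-generated embeddings (the `search` helper is illustrative, not part of the notebook):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query_emb, doc_embs):
    # Return document indices sorted from most to least similar to the query
    scores = [cosine_similarity(query_emb, d) for d in doc_embs]
    return sorted(range(len(doc_embs)), key=lambda i: scores[i], reverse=True)
```

In practice the query and document vectors would come from an embeddings API; only the ranking step is shown here.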

Embeddings can also be used to cheaply search a corpus of documents for relevant information and then supply that information to GPT-3, via the prompt, to answer a question. We demonstrate this in Question_answering_using_embeddings.ipynb.
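The retrieval step produces a ranked list of passages; the remaining work is stitching the top results into the prompt sent to the model. A minimal sketch of that prompt-assembly step (the `build_prompt` helper and its wording are illustrative assumptions, not the notebook's exact template):

```python
def build_prompt(question, ranked_texts, top_k=2):
    # Concatenate the top-ranked passages into a context block,
    # then append the user's question for the model to answer.
    context = "\n\n".join(ranked_texts[:top_k])
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

The resulting string would be sent to GPT-3 as the completion prompt; limiting `top_k` keeps the prompt within the model's context window.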

Using embeddings for recommendations is shown in Recommendation_using_embeddings.ipynb. As in search, cosine similarity scores can either be used on their own to rank items or as features in larger ranking algorithms.
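Recommendation by embedding similarity is nearest-neighbor search with the current item as the query, excluding the item itself. A minimal sketch, again with toy vectors standing in for real embeddings (the `recommend` helper is an illustrative assumption):

```python
import numpy as np

def recommend(item_idx, embs, k=2):
    # Rank all items by cosine similarity to the chosen item,
    # then return the top k indices, skipping the item itself.
    target = embs[item_idx]
    sims = embs @ target / (np.linalg.norm(embs, axis=1) * np.linalg.norm(target))
    order = np.argsort(-sims)
    return [int(i) for i in order if i != item_idx][:k]
```

These raw similarity rankings can be served directly, or the `sims` values can feed a downstream ranking model as one feature among many.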

In Customizing_embeddings.ipynb, we provide an example method for customizing your embeddings using training data. The idea is to train a custom matrix that multiplies embedding vectors to produce new, customized embeddings. With good training data, this custom matrix helps emphasize the features relevant to your training labels. You can equivalently view the matrix multiplication as (a) a modification of the embeddings or (b) a modification of the distance function used to measure the distances between embeddings.
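The equivalence between views (a) and (b) is a short linear-algebra fact: comparing transformed embeddings `W a` and `W b` with a dot product is the same as comparing the original embeddings under the modified inner product `a^T (W^T W) b`. A minimal numeric check (the matrix here is random for illustration; in the notebook it would be learned from training data):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=4)          # original embedding of item A
b = rng.normal(size=4)          # original embedding of item B
W = rng.normal(size=(4, 4))     # stand-in for a learned customization matrix

# View (a): transform both embeddings, then take an ordinary dot product
dot_transformed = (W @ a) @ (W @ b)

# View (b): keep embeddings fixed, use the modified inner product a^T (W^T W) b
dot_metric = a @ (W.T @ W) @ b

assert np.isclose(dot_transformed, dot_metric)
```

Because the two views are identical, you can either precompute customized embeddings once, or keep the originals and apply the modified distance at comparison time, whichever is cheaper for your workload.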
