Context Precision

Context Precision Metric

Context Precision measures the proportion of relevant chunks in the retrieved contexts. It is computed as the mean of precision@k taken over the ranks at which relevant chunks appear.

Definitions

  • Precision@k: The ratio of relevant chunks (true positives) among the top k retrieved chunks to the total number of chunks retrieved up to rank k

    \text{Precision@k} = \frac{\text{true positives@k}}{\text{true positives@k} + \text{false positives@k}}
  • Context Precision is calculated using the formula:

    \text{Context Precision@K} = \frac{\sum_{k=1}^{K} \left( \text{Precision@k} \times v_k \right)}{\text{Total number of relevant items in the top } K \text{ results}}

Where:

  • K is the total number of chunks in the retrieved contexts.

  • \( v_k \in \{0, 1\} \) is the relevance indicator at rank k (1 if the chunk at rank k is relevant, 0 if not).
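The formula above can be sketched in plain Python. This is a minimal reference implementation of the metric itself, not tied to any particular library; the input is simply the list of per-rank relevance indicators v_k:

```python
def context_precision(relevance: list[int]) -> float:
    """Compute Context Precision@K from per-rank relevance indicators.

    relevance[k-1] is v_k: 1 if the chunk at rank k is relevant, else 0.
    """
    total_relevant = sum(relevance)
    if total_relevant == 0:
        return 0.0  # no relevant chunks among the retrieved contexts

    score = 0.0
    true_positives = 0
    for k, v_k in enumerate(relevance, start=1):
        true_positives += v_k
        precision_at_k = true_positives / k  # TP@k / (TP@k + FP@k)
        score += precision_at_k * v_k        # only ranks with relevant chunks contribute

    return score / total_relevant


# Example: relevant chunks at ranks 1 and 3 out of K = 4 retrieved chunks.
# Precision@1 = 1/1, Precision@3 = 2/3, so the score is (1 + 2/3) / 2 ≈ 0.833.
print(round(context_precision([1, 0, 1, 0]), 3))
```

Note that because the sum is weighted by v_k and divided by the number of relevant items, a relevant chunk ranked early contributes a higher precision@k than the same chunk ranked late; the metric therefore rewards retrievers that place relevant chunks first.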

Summary

Context Precision provides insight into the effectiveness of the retrieved contexts by quantifying how highly the relevant chunks are ranked among the top results.

Example Code: Context Precision Evaluation

This example demonstrates how to compute the Context Precision metric using the ContextPrecisionEvaluator with the OpenAI language model.
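The evaluator's exact interface is not reproduced here. As an illustration, the sketch below assumes a `ContextPrecisionEvaluator`-style class whose relevance judgment is a pluggable `judge` callable; in practice that callable would be an OpenAI model call deciding whether each chunk is relevant to the question. The class name is taken from this page, but the constructor and `evaluate` signature are assumptions, not the library's actual API:

```python
from typing import Callable


class ContextPrecisionEvaluator:
    """Simplified stand-in for a ContextPrecisionEvaluator.

    `judge(question, chunk)` returns True if the chunk is relevant to the
    question; in a real setup this would call an OpenAI language model.
    """

    def __init__(self, judge: Callable[[str, str], bool]):
        self.judge = judge

    def evaluate(self, question: str, contexts: list[str]) -> float:
        # v_k for each retrieved chunk, in rank order
        relevance = [1 if self.judge(question, chunk) else 0 for chunk in contexts]
        total_relevant = sum(relevance)
        if total_relevant == 0:
            return 0.0
        score, true_positives = 0.0, 0
        for k, v_k in enumerate(relevance, start=1):
            true_positives += v_k
            score += (true_positives / k) * v_k
        return score / total_relevant


# Toy judge: a keyword check stands in for the LLM relevance call.
judge = lambda question, chunk: "paris" in chunk.lower()
evaluator = ContextPrecisionEvaluator(judge)
contexts = [
    "Paris is the capital of France.",
    "Bananas are rich in potassium.",
    "The Eiffel Tower is in Paris.",
]
# Relevant at ranks 1 and 3: (1/1 + 2/3) / 2 ≈ 0.833
print(round(evaluator.evaluate("What is the capital of France?", contexts), 3))
```

Swapping the toy `judge` for a model-backed one (e.g. an OpenAI chat completion prompted to answer "relevant: yes/no") turns this into an LLM-judged Context Precision evaluation.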
