The best thing about OpenLLMetry is that it supports a wide range of LLMs and vector DBs out of the box. You just install the SDK and get metrics, traces and logs - without any extra work.
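
For example, installing @traceloop/node-server-sdk and calling its initialize function once at startup is typically all that's needed. The snippet below is a minimal sketch; the appName value is illustrative:

import * as traceloop from "@traceloop/node-server-sdk";

// Initialize once at application startup; instrumentation for supported
// LLMs and vector DBs is applied automatically from this point on.
traceloop.initialize({
  appName: "my-app", // illustrative name
  disableBatch: true, // send spans immediately, handy for local development
});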

Check out the list of supported systems for Python and for TypeScript.

If your favorite vector DB or LLM isn't supported out of the box, you can still use OpenLLMetry to report those calls manually. Please also open an issue so we can prioritize adding support for it.

Here’s how you can do that manually in the meantime:

Reporting Vector DB calls

To track a call to a vector DB, wrap that call with the withVectorDBCall function. The callback you provide receives a span object you can use to report the query vector as well as the results of the call.

import * as traceloop from "@traceloop/node-server-sdk";

const results = await traceloop.withVectorDBCall(
  { vendor: "elastic", type: "query" },
  async ({ span }) => {
    // report the query vector that is about to be sent
    span.reportQuery({ queryVector: [1, 2, 3] });

    // call the vector DB like you normally would
    // (client here is your existing vector DB client)
    const results = await client.knnSearch({
      ...
    });

    // report the matches returned by the vector DB
    span.reportResults({
      results: [
        {
          ids: "1",
          scores: 0.5,
          distances: 0.1,
          metadata: { key: "value" },
          vectors: [1, 2, 3],
          documents: "doc",
        },
      ],
    });

    return results;
  },
);