mlflow-openai - v0.1.0-rc.0


    MLflow TypeScript SDK - OpenAI

    Seamlessly integrate MLflow Tracing with OpenAI to automatically trace your OpenAI API calls.

    Package: mlflow-openai (available on npm)
    Description: Auto-instrumentation integration for OpenAI.
    npm install mlflow-openai
    

    The package declares mlflow-tracing and openai as peer dependencies. Depending on your package manager, you may need to install these two packages separately.
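    For example, with npm you can install the integration and both peer dependencies in one command (package versions are resolved to the latest published releases; pin versions if your project requires it):

```shell
# Install the integration together with its peer dependencies
npm install mlflow-openai mlflow-tracing openai
```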

    Start an MLflow Tracking Server if you don't have one already:

    pip install mlflow
    mlflow server --backend-store-uri sqlite:///mlruns.db --port 5000

    Self-hosting an MLflow server requires Python 3.10 or later. If you don't want to run your own server, you can also use a managed MLflow service for free to get started quickly.

    Instantiate MLflow SDK in your application:

    import * as mlflow from 'mlflow-tracing';

    mlflow.init({
      trackingUri: 'http://localhost:5000',
      experimentId: '<experiment-id>'
    });
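    A common pattern is to read these values from the environment so the same code works across local and deployed environments. This is only a sketch: the variable names below are illustrative, not ones the SDK reads automatically.

```typescript
import * as mlflow from 'mlflow-tracing';

// Fall back to local defaults when the environment variables are unset.
// MLFLOW_TRACKING_URI and MLFLOW_EXPERIMENT_ID are names chosen here for
// illustration; pass whatever configuration source your app uses.
mlflow.init({
  trackingUri: process.env.MLFLOW_TRACKING_URI ?? 'http://localhost:5000',
  experimentId: process.env.MLFLOW_EXPERIMENT_ID ?? '<experiment-id>'
});
```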

    Create a trace:

    import { OpenAI } from 'openai';
    import { tracedOpenAI } from 'mlflow-openai';

    // Wrap the OpenAI client with the tracedOpenAI function
    const client = tracedOpenAI(new OpenAI());

    // Invoke the client as usual
    const response = await client.chat.completions.create({
      model: 'o4-mini',
      messages: [
        { role: 'system', content: 'You are a helpful weather assistant.' },
        { role: 'user', content: "What's the weather like in Seattle?" }
      ]
    });
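    The wrapped client returns the same response objects as the plain OpenAI SDK, so you read results exactly as you normally would. The snippet below demonstrates this against a minimal response-shaped object (with made-up content) so it runs without an API key:

```typescript
// A stand-in with the same shape as an OpenAI chat completion response;
// the content string is invented for illustration.
const response = {
  choices: [{ message: { role: 'assistant', content: 'Partly cloudy, 15°C.' } }]
};

// Reading the reply is unchanged by tracing.
const reply = response.choices[0].message.content;
console.log(reply);
```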

    View traces in MLflow UI:

    [Screenshot: MLflow Tracing UI]

    Official documentation for the MLflow TypeScript SDK can be found in the MLflow documentation.

    This project is licensed under the Apache License 2.0.