Roadmap
Langfuse is open source and we want to be fully transparent about what we're working on and what's next. This roadmap is a living document, and we'll update it as we make progress.
Your feedback is highly appreciated
- Feel like something is missing? Add new ideas on GitHub or vote on existing ones. Both are great ways to contribute to Langfuse and help us understand what is important to you.
🚀 Released
The 10 most recent changelog items:
- PostHog integration
- OpenAI SDK integration for JS/TS SDK
- Docker images on Docker Hub
- ISO 27001 certification
- OpenAI Integration tracks used Langfuse Prompts
- ChatML in Prompt Management
- Support for LangChain async, batch and streaming interfaces
- Decorator-based tracing for Python (see the example after this list)
- Improved ChatML message rendering
- Token/cost tracking for Claude 3 models
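For example, the decorator-based tracing listed above wraps plain Python functions. A minimal sketch, assuming the Python SDK's `langfuse.decorators` module and Langfuse credentials configured via environment variables:

```python
# Minimal sketch of decorator-based tracing: the outermost @observe() call
# creates a trace, and nested decorated calls become spans underneath it.
from langfuse.decorators import observe

@observe()
def summarize(text: str) -> str:
    # Placeholder for an LLM call; inputs/outputs are captured automatically.
    return text[:50]

@observe()
def main() -> str:
    return summarize("Langfuse records this nested call automatically.")

if __name__ == "__main__":
    main()
```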
Subscribe to our mailing list to get occasional email updates about new features.
🧪 Pre-release
- Evaluation service to run custom model-based evals on historical and newly ingested traces (docs)
- Improved datasets UI/UX
Please reach out at early-access@langfuse.com if you are interested in testing these features before they are released, providing feedback, and helping shape the future of Langfuse.
🚧 In progress
- Manual evaluations in the Langfuse UI along multiple dimensions; currently only a single dimension is supported
- Prompt playground connected to datasets and prompt management
- Infrastructure: queued ingestion to handle large and spiky loads on small instances
- Improved tables across the Langfuse UI to display all relevant information and be more user-friendly
- Move to SDK references generated from docstrings to improve the developer experience (IntelliSense) and reduce the risk of errors
- SOC2 (Type 2) certification for Langfuse Cloud
🔮 Planned
- Datasets: make them usable in CI (e.g., GitHub Actions; see the sketch after this list)
- Prompt management: multiple environments, comments on versions
- Infrastructure: add OLAP database for faster analytical queries
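Until native CI support lands, a dataset run like this can already be scripted with the existing Python SDK. A hedged sketch, where the dataset name `ci-eval-set`, the `my_app` function, and the run name are illustrative assumptions:

```python
# Hypothetical CI script: run every dataset item through the application and
# link the resulting trace to a named dataset run in Langfuse.
from langfuse import Langfuse

def my_app(question: str) -> str:
    # Placeholder for the application under test.
    return f"answer to: {question}"

langfuse = Langfuse()  # reads LANGFUSE_* credentials from the environment
dataset = langfuse.get_dataset("ci-eval-set")  # assumed dataset name

for item in dataset.items:
    trace = langfuse.trace(name="ci-run", input=item.input)
    trace.update(output=my_app(item.input))
    item.link(trace, run_name="github-actions-run")  # attach trace to the run

langfuse.flush()  # ensure all events are sent before the CI job exits
```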
⚠️ Upcoming breaking changes
- OpenAI integration: dropping support for `openai < 1.0.0` to greatly simplify the integration and improve the developer experience for everyone on `openai >= 1` (see the sketch after this list)
- Self-hosting: Langfuse will move from a single container to a multi-container setup (queues, async worker, OLAP database) to improve scalability and reliability. We will publish an extensive guide once the changes are in pre-release to help everyone migrate.
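For reference, the OpenAI integration is a drop-in replacement for the OpenAI Python client. A minimal sketch of usage on `openai >= 1` (model and prompt are illustrative):

```python
# Drop-in import: OpenAI calls made through this module are traced in Langfuse.
# Requires openai >= 1.0.0 and LANGFUSE_* credentials in the environment.
from langfuse.openai import openai

completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model
    messages=[{"role": "user", "content": "Hello, Langfuse!"}],
)
print(completion.choices[0].message.content)
```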
🙏 Feature requests and bug reports
The best way to support Langfuse is to share your feedback, report bugs, and upvote ideas suggested by others.