Is Langfuse accurate?

I’m using Langfuse, deployed locally, to trace sessions and monitor costs.

However, I’ve noticed that the reported cost isn’t accurate. For every request, the logs show around 6,000 tokens in total, but the Langfuse UI shows over 16,000 tokens.

Has anyone encountered a similar issue?

Here’s a sample log:

42|knowhow-be | DEBUG It looks like you typed “hellow orld”—maybe you meant “hello world”?
42|knowhow-be | If you need help with something specific (like data analysis, Jira
42|knowhow-be | queries, or anything at Vietnam), just
42|knowhow-be | let me know your request!
42|knowhow-be | DEBUG ************************ METRICS *************************
42|knowhow-be | DEBUG * Tokens: input=6051, output=57, total=6108,
42|knowhow-be | cached=2304
42|knowhow-be | DEBUG * Prompt tokens details: {'audio_tokens': 0, 'cached_tokens': 2304}
42|knowhow-be | DEBUG * Completion tokens details: {'accepted_prediction_tokens': 0,
42|knowhow-be | 'audio_tokens': 0, 'reasoning_tokens': 0,
42|knowhow-be | 'rejected_prediction_tokens': 0}
42|knowhow-be | DEBUG * Time: 0.9879s
42|knowhow-be | DEBUG * Tokens per second: 57.6981 tokens/s
42|knowhow-be | DEBUG * Time to first token: 0.4884s
42|knowhow-be | DEBUG ************************ METRICS *************************
42|knowhow-be | DEBUG ------------- Azure Async Response Stream End --------------
42|knowhow-be | DEBUG Added RunResponse to Memory
42|knowhow-be | DEBUG *** Agent Run End: baf78781-4baa-4d17-9296-08ca1cd85b6b ***

And here’s the corresponding entry from Langfuse:

98acb242-7fa8-4909-b0c1-472a839d6cb0
2025-10-10 10:49:58
Duration: 2.00s
Environment: default
Cost: $0.079926
Tokens: 9,007 → 7,739 (∑ 16,746)
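To make the mismatch concrete, here is a quick sanity check using only the figures copied from the log and trace above (no claim about the cause, just the arithmetic):

```python
# Compare the token totals reported by the Agno debug log
# against those shown in the Langfuse trace.
# All numbers are copied from the log/trace pasted above.

# From the Agno debug log
log_input, log_output, log_cached = 6051, 57, 2304
log_total = log_input + log_output  # 6108, matches the log's "total="

# From the Langfuse trace entry
lf_input, lf_output = 9007, 7739
lf_total = lf_input + lf_output  # 16746, matches the UI's sum

print(f"log total:      {log_total}")
print(f"langfuse total: {lf_total}")
print(f"difference:     {lf_total - log_total}")
```

The Langfuse figure is more than 10,000 tokens higher than what the model reported for the same request, which is what makes the cost look inflated.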

Hey @hunglv53, thanks for reaching out and supporting Agno.
I’ve shared this with the team; we’re working through all requests one by one and will get back to you soon. If it’s urgent, please let us know. We appreciate your patience!

Hi @hunglv53, we are sorry for the delay here. Our engineer, @yash, will be with you shortly to help you out.

Hi @hunglv53! The token information displayed in your Agno logs is captured directly from the model response, so ideally the Langfuse trace should show the same figures. Would it be possible for you to share both your Agno and Langfuse versions? I would like to try to replicate this bug.
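If it helps, here is one way to grab the installed versions with only the standard library. The distribution names `agno` and `langfuse` are assumptions; adjust them if your installs are named differently:

```python
from importlib.metadata import version, PackageNotFoundError

def get_version(package: str) -> str:
    """Return the installed version of a distribution, or a placeholder."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "not installed"

# Distribution names assumed; change if yours differ.
for pkg in ("agno", "langfuse"):
    print(f"{pkg}: {get_version(pkg)}")
```

Running this in the same environment as the app and pasting the output should give the team what they need.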