memory leak? #83
Does the model reliably respond in under 5 seconds? Based on your error message it looks like it may be taking longer and thus slowly accumulating listeners until it reaches its max and throws those errors.
It does reliably respond, yes, but that might take a little more than a couple of seconds. While I fixed it locally by only asking the models something once every minute (I didn't need finer-grained control), it's still something that should probably be looked at.
If you are making requests every 5 seconds and it takes 6 seconds to respond, then you lose a second each time, so after 5 requests there are two open listeners at all times. That will slowly build up over time, and if it takes 10 seconds to respond then you add an additional open listener every 5 seconds. That is not anything that can be fixed anywhere but in your code, which is why it gives you the error message.
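The arithmetic above can be sketched with plain timers (hypothetical timings; `sleep` stands in for a slow model response): scheduling on a fixed interval overlaps requests whenever the work outlasts the interval, while awaiting each response keeps at most one request in flight.

```javascript
// Sketch with hypothetical timings: `sleep` stands in for a slow model call.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Fire on a fixed interval, like setInterval-driven polling.
async function fixedInterval(intervalMs, workMs, ticks) {
  let inFlight = 0;
  let maxInFlight = 0;
  const timer = setInterval(async () => {
    inFlight += 1;
    maxInFlight = Math.max(maxInFlight, inFlight);
    await sleep(workMs); // the "request" outlives the interval
    inFlight -= 1;
  }, intervalMs);
  await sleep(intervalMs * ticks + workMs);
  clearInterval(timer);
  return maxInFlight;
}

// Wait for each response before starting the next request.
async function sequentialLoop(workMs, iterations) {
  let inFlight = 0;
  let maxInFlight = 0;
  for (let i = 0; i < iterations; i += 1) {
    inFlight += 1;
    maxInFlight = Math.max(maxInFlight, inFlight);
    await sleep(workMs);
    inFlight -= 1;
  }
  return maxInFlight;
}

(async () => {
  // 50 ms interval but 120 ms of "work": requests pile up (value > 1).
  console.log('fixed interval max in flight:', await fixedInterval(50, 120, 10));
  // Sequential: never more than one request in flight.
  console.log('sequential max in flight:', await sequentialLoop(120, 5)); // prints 1
})();
```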
I'm not so sure about that. The model responds within 1 second and I'm repeating the same question over and over again every 5 seconds. The code I'm using is exactly as in my initial post. Why don't you try it out and see for yourself? The issue literally can't be in my code as I'm not creating any listeners. I'm only calling `ollama.chat()`.
I see what's happening. I believe it is related to upstream issue whatwg/fetch#1287.
That to me is a major motivation to ditch nodejs and play with Rust or even C++. Not because of this very issue or because of using js-tech, but because it's mind-boggling to me that a fix like this has to go on the spec bandwagon before the fix finds its way into the implementations. You're easily talking about a decade before all implementations are fixed. Browser-land (spec-wise) moves at a snail's pace for fixes (it's faster for new stuff). Look at the bug report: it's about to hit 4 years in a couple of months, and that's just the reporting side of things; nothing has been done spec-wise to fix it yet (that I could find). In Rust or C++ land you'd just use a different networking library, or make a fix and recompile yourself. Oh well, enough drifting. It's up to you what you do with this bug report.
@markg85 to solve this problem you can change the number of default max listeners via `events.defaultMaxListeners`.

The above will do the trick. The limit can also be configured individually per emitter, but I suspect raising the default will be enough here. Finally, make sure you monitor the memory usage of the node process, as changing the limit only raises the warning threshold; it doesn't remove the underlying listeners.

https://nodejs.org/api/events.html#eventsdefaultmaxlisteners
Hi,
I was running a chat in a loop every 5 seconds.
After a couple of minutes I get:
Note that the `101` in there increases every 5 seconds too, so I'm fairly sure this is caused by ollama-js.

You might be wondering, why run this in a loop?
Simple really: I have some rapidly changing data that I want to ask a model about.

Am I doing something wrong here? Should I be using a different function?
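For what it's worth, one way to poll rapidly changing data without requests overlapping is to await each answer before sleeping. This is only a sketch under assumptions: `askModel` is a hypothetical stand-in for the real `ollama.chat()` call.

```javascript
// A sketch of polling without overlapping requests. `askModel` is a
// hypothetical stand-in for the real ollama.chat() call.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function askModel(question) {
  await sleep(20); // pretend the model takes a moment to answer
  return `answer to: ${question}`;
}

async function poll(iterations, pauseMs) {
  const answers = [];
  for (let i = 0; i < iterations; i += 1) {
    // Awaiting here guarantees the previous request has fully finished
    // before the next one starts, so at most one is ever in flight.
    answers.push(await askModel(`iteration ${i}`));
    await sleep(pauseMs);
  }
  return answers;
}

poll(3, 10).then((answers) => console.log(answers.length)); // prints 3
```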