
Unable to bind the private EC2 instance IP in the ollama service file to restrict access #4263

Open
devivaraprasad901 opened this issue May 8, 2024 · 0 comments
Labels
bug Something isn't working

Comments

devivaraprasad901 commented May 8, 2024

What is the issue?

Unable to bind the private EC2 instance IP in the ollama service file to restrict access; the instance is in a private VPC. Service file:
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_HOST=<private EC2 instance IP (x.x.x.x)>"

[Install]
WantedBy=default.target
(File: /etc/systemd/system/ollama.service)
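For reference, one common way to change this setting without editing the unit file in place is a systemd drop-in. This is a sketch, not the reporter's configuration: 0.0.0.0 binds all interfaces, so it only makes sense together with a network-level restriction such as an EC2 security group.

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# (created with: sudo systemctl edit ollama)
[Service]
# Bind all local interfaces; restrict who can reach port 11434
# with an EC2 security group instead of the bind address.
Environment="OLLAMA_HOST=0.0.0.0"
```

After saving the drop-in, run sudo systemctl daemon-reload followed by sudo systemctl restart ollama for the change to take effect.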

When I start the service, it does not come up. sudo systemctl status ollama shows:
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
Active: activating (auto-restart) (Result: exit-code) since Wed 2024-05-08 18:31:20 UTC; 2s ago
Process: 311875 ExecStart=/usr/local/bin/ollama serve (code=exited, status=1/FAILURE)
Main PID: 311875 (code=exited, status=1/FAILURE)
CPU: 13ms

SYSLOG:
2024-05-08T18:06:46.719183+00:00 ip-xx-xx-x-xx systemd[1]: Started ollama.service - Ollama Service.
2024-05-08T18:06:46.729714+00:00 ip-xx-xx-xx-xx ollama[307072]: Error: listen tcp x.x.x.x:11434: bind: cannot assign requested address
2024-05-08T18:06:46.731993+00:00 ip-xx-xxx-x-xx systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
2024-05-08T18:06:46.732178+00:00 ip-x-x-x-x systemd[1]: ollama.service: Failed with result 'exit-code'.
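The "cannot assign requested address" error in the log means the IP given in OLLAMA_HOST is not assigned to any network interface on the machine running ollama: a process can only bind to its own local addresses, never to another instance's private IP. A minimal sketch reproducing the same kernel-level error (203.0.113.7 is a reserved TEST-NET address, used here as a stand-in for an IP that does not belong to the local machine):

```python
import errno
import socket

def can_bind(host, port=0):
    # Try to bind a TCP socket to the given host; port 0 lets the OS
    # pick a free ephemeral port. Returns True if the address is
    # assigned to a local interface, False if it is not.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True
    except OSError as e:
        # EADDRNOTAVAIL is the same "cannot assign requested address"
        # error that ollama serve reports in the syslog above.
        if e.errno == errno.EADDRNOTAVAIL:
            return False
        raise
    finally:
        s.close()

print(can_bind("127.0.0.1"))    # loopback is always local
print(can_bind("203.0.113.7"))  # not an address on this machine
```

On an EC2 instance, the instance's own private IP (as shown by ip addr) would bind successfully; any other instance's private IP fails exactly as in the log.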

I have three EC2 instances:
One instance has ollama installed, with the LLAMA3 model running.
A second EC2 instance runs the LLM inference code, which stores the access keys.
The application code is hosted on a third EC2 instance, which accesses the LLAMA3 instance using the LLM inference keys.

My goal is to set the LLM inference instance's private IP in the ollama service file on the LLAMA3 EC2 instance.
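Since a service can only bind to its own local addresses, the usual way on AWS to restrict access to a single caller is not OLLAMA_HOST but a security-group ingress rule scoped to that caller's private IP. A sketch of building such a rule; ollama_ingress_rule and the 10.0.1.25 address are hypothetical, and the resulting dict is the shape boto3's EC2 client accepts in the IpPermissions argument of authorize_security_group_ingress:

```python
def ollama_ingress_rule(client_private_ip, port=11434):
    # Build an EC2 security-group ingress rule allowing only the
    # inference instance's private IP to reach the ollama port.
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{
            # /32 restricts the rule to exactly one address.
            "CidrIp": f"{client_private_ip}/32",
            "Description": "LLM inference instance only",
        }],
    }

rule = ollama_ingress_rule("10.0.1.25")  # hypothetical private IP
print(rule["IpRanges"][0]["CidrIp"])     # 10.0.1.25/32
```

With ollama bound to 0.0.0.0 and a rule like this attached to the LLAMA3 instance's security group, only the inference instance can reach port 11434.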

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

ollama version is 0.1.32

@devivaraprasad901 devivaraprasad901 added the bug Something isn't working label May 8, 2024