[GH-ISSUE #3047] Ollama logging for ConnectionResetError #1874

Closed
opened 2026-04-12 11:57:23 -05:00 by GiteaMirror · 3 comments

Originally created by @Bardo-Konrad on GitHub (Mar 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3047

I access Ollama using the Python library.

It communicates fine, but after a few exchanges I always get the error below. (The German Windows message translates to "An existing connection was forcibly closed by the remote host".) It seems I need to reset Ollama from Python, or maybe the context length is being exceeded. How do I figure out which?

Traceback (most recent call last):
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 462, in _make_request
    httplib_response = conn.getresponse()
                       ^^^^^^^^^^^^^^^^^^
  File "c:\Lib\http\client.py", line 1386, in getresponse
    response.begin()
  File "c:\Lib\http\client.py", line 325, in begin
    version, status, reason = self._read_status()
                              ^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\http\client.py", line 286, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\socket.py", line 706, in readinto
    return self._sock.recv_into(b)
           ^^^^^^^^^^^^^^^^^^^^^^^
ConnectionResetError: [WinError 10054] Eine vorhandene Verbindung wurde vom Remotehost geschlossen

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\Lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\urllib3\util\retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\urllib3\packages\six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "c:\Lib\site-packages\urllib3\connectionpool.py", line 462, in _make_request
    httplib_response = conn.getresponse()
                       ^^^^^^^^^^^^^^^^^^
  File "c:\Lib\http\client.py", line 1386, in getresponse
    response.begin()
  File "c:\Lib\http\client.py", line 325, in begin
    version, status, reason = self._read_status()
                              ^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\http\client.py", line 286, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\socket.py", line 706, in readinto
    return self._sock.recv_into(b)
           ^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'Eine vorhandene Verbindung wurde vom Remotehost geschlossen', None, 10054, None))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\Lib\site-packages\langchain_community\embeddings\ollama.py", line 157, in _process_emb_response
    res = requests.post(
          ^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\requests\api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\requests\api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\requests\adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'Eine vorhandene Verbindung wurde vom Remotehost geschlossen', None, 10054, None))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\test.py", line 123, in <module>
    rag(ds("documents"), "")
  File "E:\test.py", line 93, in rag
    result = chain.invoke(aufgabe).replace("\n"," ").replace("\r"," ").replace("  "," ")
             ^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\runnables\base.py", line 2075, in invoke
    input = step.invoke(
            ^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\runnables\base.py", line 2712, in invoke
    output = {key: future.result() for key, future in zip(steps, futures)}
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\runnables\base.py", line 2712, in <dictcomp>
    output = {key: future.result() for key, future in zip(steps, futures)}
                   ^^^^^^^^^^^^^^^
  File "c:\Lib\concurrent\futures\_base.py", line 456, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "c:\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\retrievers.py", line 141, in invoke
    return self.get_relevant_documents(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\retrievers.py", line 244, in get_relevant_documents
    raise e
  File "c:\Lib\site-packages\langchain_core\retrievers.py", line 237, in get_relevant_documents
    result = self._get_relevant_documents(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_core\vectorstores.py", line 674, in _get_relevant_documents
    docs = self.vectorstore.similarity_search(query, **self.search_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\vectorstores\chroma.py", line 348, in similarity_search
    docs_and_scores = self.similarity_search_with_score(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\vectorstores\chroma.py", line 437, in similarity_search_with_score
    query_embedding = self._embedding_function.embed_query(query)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\embeddings\ollama.py", line 217, in embed_query
    embedding = self._embed([instruction_pair])[0]
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\embeddings\ollama.py", line 192, in _embed
    return [self._process_emb_response(prompt) for prompt in iter_]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\embeddings\ollama.py", line 192, in <listcomp>
    return [self._process_emb_response(prompt) for prompt in iter_]
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Lib\site-packages\langchain_community\embeddings\ollama.py", line 163, in _process_emb_response
    raise ValueError(f"Error raised by inference endpoint: {e}")
ValueError: Error raised by inference endpoint: ('Connection aborted.', ConnectionResetError(10054, 'Eine vorhandene Verbindung wurde vom Remotehost geschlossen', None, 10054, None))
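A connection reset like this usually means the server process died mid-request rather than returned an error. A minimal client-side sketch of how to surface and survive that, assuming the default local endpoint `http://127.0.0.1:11434/api/embeddings` (the helper name `embed_with_retry` and the `post` injection point are my own, for illustration):

```python
import time
import requests

OLLAMA_URL = "http://127.0.0.1:11434/api/embeddings"  # default local endpoint

def embed_with_retry(prompt, model="nomic-embed-text", retries=3, delay=2.0,
                     post=requests.post):
    """POST to the Ollama embeddings endpoint, retrying on dropped connections.

    A reset here typically means the server process died (e.g. CUDA OOM),
    so the backoff gives it time to come back before we give up.
    The `post` parameter exists only so the retry logic is testable.
    """
    last_err = None
    for attempt in range(retries):
        try:
            resp = post(OLLAMA_URL, json={"model": model, "prompt": prompt},
                        timeout=120)
            resp.raise_for_status()
            return resp.json()["embedding"]
        except (requests.exceptions.ConnectionError, ConnectionResetError) as e:
            last_err = e
            time.sleep(delay * (attempt + 1))  # linear backoff before retrying
    raise RuntimeError(f"Ollama unreachable after {retries} attempts") from last_err
```

This does not fix the underlying crash, but it distinguishes "server briefly restarting" from "server is down for good", which answers the "how do I figure it out" part: if the retry also fails, check the server logs rather than the client.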
GiteaMirror added the bug and needs more info labels 2026-04-12 11:57:23 -05:00

@jmorganca commented on GitHub (Mar 11, 2024):

Hi there, would it be possible to check whether you're seeing an error in the logs? On Windows you can click the taskbar icon -> View logs -> open server.log.


@Bardo-Konrad commented on GitHub (Mar 11, 2024):

I looked into it and yes, I found:

  • CUDA error: out of memory
  • Error #01: write tcp 127.0.0.1:11434->127.0.0.1:55567: wsasend:

How do I prevent the OOM?
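One common mitigation, sketched under the assumption that the standard Ollama REST API options are available: lowering `options.num_ctx` shrinks the KV cache and therefore the GPU memory each request needs (switching to a smaller or more heavily quantized model helps the same way). The helper name `build_generate_request` and the model name are illustrative, not from this thread:

```python
def build_generate_request(prompt, model="llama2", num_ctx=2048):
    """Request body for Ollama's /api/generate with a reduced context window.

    `options.num_ctx` caps the context length the server allocates; a smaller
    value means a smaller KV cache and less GPU memory per request, which is
    the usual lever against CUDA out-of-memory crashes.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx},  # smaller KV cache -> less VRAM
    }

# Usage (requires a running server):
# requests.post("http://127.0.0.1:11434/api/generate",
#               json=build_generate_request("Why is the sky blue?"),
#               timeout=120)
```

The trade-off is that prompts longer than `num_ctx` tokens get truncated, so pick the smallest window your RAG context actually fits in.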


@jmorganca commented on GitHub (Mar 12, 2024):

Sorry about this, working on fixing it! Merging with #1952

Reference: github-starred/ollama#1874