Cloudron Forum
Auto-tagging ollama setup

Linkwarden
20 Posts 6 Posters 3.1k Views 7 Watching
mononym:

I used the command ifconfig | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}' to get the address and set the endpoint value to http://###.##.##.###:11434. It gives me the same error as before: ECONNREFUSED

    I think the relevant log is this:

    Apr 02 16:38:04 [1] _currentUrl: 'http://###.##.##.###:11434/api/generate',
    Apr 02 16:38:04 [1] [Symbol(shapeMode)]: true,
    Apr 02 16:38:04 [1] [Symbol(kCapture)]: false
    Apr 02 16:38:04 [1] },
    Apr 02 16:38:04 [1] cause: Error: connect ECONNREFUSED ###.##.##.###:11434
    Apr 02 16:38:04 [1] at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1634:16)
    Apr 02 16:38:04 [1] at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
    Apr 02 16:38:04 [1] errno: -111,
    Apr 02 16:38:04 [1] code: 'ECONNREFUSED',
    Apr 02 16:38:04 [1] syscall: 'connect',
    Apr 02 16:38:04 [1] address: '###.##.##.###',
    Apr 02 16:38:04 [1] port: 11434
    Apr 02 16:38:04 [1] }
    Apr 02 16:38:04 [1] }
    Apr 02 16:38:07 [1] AxiosError: Request failed with status code 520
    Apr 02 16:38:07 [1] at settle (/app/code/node_modules/axios/lib/core/settle.js:19:12)
    Apr 02 16:38:07 [1] at IncomingMessage.handleStreamEnd (/app/code/node_modules/axios/lib/adapters/http.js:572:11)
    Apr 02 16:38:07 [1] at IncomingMessage.emit (node:events:530:35)
    Apr 02 16:38:07 [1] at IncomingMessage.emit (node:domain:489:12)
    Apr 02 16:38:07 [1] at endReadableNT (node:internal/streams/readable:1698:12)
    Apr 02 16:38:07 [1] at processTicksAndRejections (node:internal/process/task_queues:90:21) {
    Apr 02 16:38:07 [1] code: 'ERR_BAD_RESPONSE',
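For reference, the pipeline in that command just filters ifconfig output down to non-loopback IPv4 addresses. Run against captured sample output (the addresses below are made up), it behaves like this:

```shell
# Simulate two "inet" lines of ifconfig output and apply the same filter:
# keep "inet " lines, drop the loopback line, print the second field (the address).
printf 'inet 127.0.0.1  netmask 255.0.0.0\ninet 172.18.16.199  netmask 255.255.0.0\n' \
  | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}'
# prints: 172.18.16.199
```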
    
robi (#11):

@mononym We can't tell what address you found or used.

necrevistonnezr (#12):

@mononym It should be something like 172.18.16.199 (that's the IP of my Linkwarden container).

mononym (#13):

Yes, Linkwarden is at 172.18.19.122 and Open WebUI is at 172.18.17.227. Isn't that where you can reach the Open WebUI running ollama? But in its admin pane, the values for ollama are these:

[Screenshot: ollama settings in the Open WebUI admin pane]

girish wrote:

Should we bake ollama into the app package as an optional component? For the moment, if you have an external ollama, you can set that variable in Linkwarden to point to it.

mononym (#14):

Hi @girish

Just wondering if including a tiny LLM in the package is still on the table. Hoarded links quickly get out of hand, and Linkwarden's search function is not the best (in my case). Auto-tagging would be very helpful for finding links saved ages ago.

girish (Staff, #15):

            @mononym I have made a task to make ollama a standalone app. This way other apps can benefit too (and maybe we can remove it from packages like openwebui).


mononym (#17, edited):

@girish thanks for packaging Ollama! I can confirm that it works for auto-tagging 🙂

1. From the Ollama app terminal, get the IP with ifconfig | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}'
2. In the Linkwarden env file, set the variable NEXT_PUBLIC_OLLAMA_ENDPOINT_URL to that IP plus :11434, e.g. http://172.18.16.199:11434
3. Follow the Ollama docs to pull a model, then follow the Linkwarden docs https://docs.linkwarden.app/self-hosting/ai-worker and https://docs.linkwarden.app/Usage/ai-tagging to enable the feature.
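Putting steps 1 and 2 together, the resulting line in the Linkwarden env file would look like this (the IP here is an example; use the address you got from step 1):

```
# Linkwarden env file (the app's env settings in Cloudron)
# Example container IP from step 1 -- replace with your own.
NEXT_PUBLIC_OLLAMA_ENDPOINT_URL=http://172.18.16.199:11434
```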
girish (Staff, #18):

@mononym Can you not use the public endpoint https://ollama.domain.com instead of the internal one?


mononym (#19):

@girish Maybe, if there's a way to pass the bearer token, but I did not find one. When setting NEXT_PUBLIC_OLLAMA_ENDPOINT_URL to http://ollama-api.domain.com I get this response:

                    Nov 03 18:17:40 [1] responseBody: '<html>\r\n' +
                    Nov 03 18:17:40 [1] '<head><title>401 Authorization Required</title></head>\r\n' +
                    Nov 03 18:17:40 [1] '<body>\r\n' +
                    Nov 03 18:17:40 [1] '<center><h1>401 Authorization Required</h1></center>\r\n' +
                    Nov 03 18:17:40 [1] '<hr><center>nginx/1.24.0 (Ubuntu)</center>\r\n' +
                    Nov 03 18:17:40 [1] '</body>\r\n' +
                    Nov 03 18:17:40 [1] '</html>\r\n',
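That 401 page comes from the nginx proxy in front of the app, not from ollama itself. As a sketch (not something Linkwarden can currently be configured to do), a request to the public endpoint would need an Authorization header along these lines, where the domain and token are placeholders:

```
# Hypothetical: querying the public endpoint manually with a bearer token.
# ollama-api.example.com and $OLLAMA_TOKEN are placeholders.
curl -H "Authorization: Bearer $OLLAMA_TOKEN" \
  https://ollama-api.example.com/api/tags
```

Since Linkwarden's NEXT_PUBLIC_OLLAMA_ENDPOINT_URL setting has no companion option for such a header, the internal container address remains the working route.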
                    
nebulon (Staff, #20, edited):

Edit: I tried to get this working and at least got to the point where Linkwarden would talk to ollama using the token, when using the OpenAI-compatible configuration mentioned in https://docs.linkwarden.app/self-hosting/ai-worker#openai-compatible-provider

However, while it was able to authenticate, I wasn't able to get past the point where it selects a model with "tool" capabilities.
