AI on Cloudron
-
@LoudLemur Yes, this is another game changer
-
@LoudLemur It's gonna be a wild ride into the virtual future...
-
@marcusquinn Hey, thanks for that! This is so uplifting!
-
There is a leaderboard for language models here. You can see the context length they support and the required VRAM etc:
https://llm.extractum.io/list/?uncensored=1
There is a review of "uncensored" language models here:
https://scifilogic.com/open-uncensored-llm-model/
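Since the leaderboard lists the VRAM each model needs, here is a rough rule-of-thumb sketch (Python; the 20% overhead factor is an assumption, and real usage depends on context length and runtime) for checking whether a model fits on your card:

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int = 16,
                     overhead: float = 1.2) -> float:
    """Rough estimate: weight memory plus ~20% for KV cache and activations."""
    weight_gb = params_billion * (bits_per_weight / 8)
    return weight_gb * overhead

# Example: a 13B model at 16-bit vs 4-bit quantisation.
print(f"{estimate_vram_gb(13, 16):.1f} GB")  # ~31.2 GB
print(f"{estimate_vram_gb(13, 4):.1f} GB")   # ~7.8 GB
```
-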
Nvidia releases Chat with RTX
https://www.nvidia.com/en-us/ai-on-rtx/chat-with-rtx-generative-ai/
35GB download.
Training tutorial:
https://vid.puffyan.us/watch?v=y9Gi3UgNA3E&t=79
Which suggests non-Free:
https://customgpt.ai/
(Any Free alternatives?)
Pro-Tip: You should have about 100GB clear for storage. The .zip is 35GB; it unzips into the installer, which then (slowly) downloads a large amount of dependencies. If you don't have space to spare, the available models won't all be installed properly. Also, some have been reporting problems when not installing to the default location.
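Following the pro-tip above, here is a minimal sketch (Python; the 100GB figure comes from the post, the install path and everything else are assumptions) for checking free space before running the installer:

```python
import shutil
from pathlib import Path

# Assumption: checking the drive that holds the default install location.
# Point this at wherever the Chat with RTX installer will write.
install_dir = Path.home()
required_gb = 100  # zip + unpacked installer + downloaded models, per the post

free_gb = shutil.disk_usage(install_dir).free / 1e9
if free_gb < required_gb:
    print(f"Only {free_gb:.0f} GB free at {install_dir}; "
          f"clear ~{required_gb} GB before installing.")
else:
    print(f"{free_gb:.0f} GB free at {install_dir}; enough headroom to install.")
```
-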
Claude 3 is now released.
Anthropic's AI is now available in three tiers: Opus is its big brain, available to paying customers; Sonnet trades brains for speed and powers the gratis tier; Haiku is the fastest of all.
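For anyone with API access, all three tiers are reached through the same Messages endpoint; only the model ID changes. A minimal sketch with the official anthropic Python SDK (the model IDs are the launch-time ones and may have been superseded since):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Swap the model ID to trade capability for speed and cost:
# Opus (big brain), Sonnet (balanced), Haiku (fastest).
response = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=512,
    messages=[{"role": "user", "content": "Explain Cloudron in two sentences."}],
)
print(response.content[0].text)
```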
-
@LoudLemur said in AI on Cloudron:
Claude 3 is now released.
https://www.anthropic.com/news/claude-3-family
Not to the general public. It's "not available in my region" (Germany): https://support.anthropic.com/en/articles/8461763-where-can-i-access-claude-ai
-
Finetune / train a 70b language model at home! (on two 24GB GPUs)
https://www.answer.ai/posts/2024-03-06-fsdp-qlora.html
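The trick Answer.AI describe is combining FSDP (sharding the model across the two GPUs) with QLoRA (a 4-bit frozen base model plus small trainable LoRA adapters). Below is a minimal sketch of the QLoRA half using transformers/peft/bitsandbytes; the FSDP sharding itself is handled by their published training scripts, and the 70B model ID here is only a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-2-70b-hf"  # placeholder: any 70B causal LM

# Load the frozen base weights in 4-bit NF4 to fit consumer VRAM budgets.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    torch_dtype=torch.bfloat16,
)

# Attach small trainable LoRA adapters; only these are updated during training.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```
-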
@LoudLemur Opus is apparently beating GPT-4 at coding. The larger context window is much needed.
-
@necrevistonnezr Maybe try ivpn.net
-
Now this is a use for AI. Emotional video...
-
Devin - from Cognition AI
https://www.cognition-labs.com/
Resolves over 13% of coding problems without assistance, dwarfing its competitors:
https://x.com/cognition_labs/status/1767548763134964000?s=20
-
Figure 01, powered by 'Open'AI
https://x.com/BrianRoemmele/status/1767923602895319525?s=20
Intelligent, seeing robots.
-
Grok Weights
The model weights are distributed as a torrent (magnet link); the download is around 320GB:
https://academictorrents.com/details/5f96d43576e3d386c9ba65b883210a393b68210e
-
Stability AI have released Stable Video 3D.
It runs on 8GB of VRAM. Low-poly output.
https://stability.ai/news/introducing-stable-video-3d?utm_source=Twitter&utm_medium=website&utm_campaign=blog
https://huggingface.co/stabilityai/sv3d
-
Anthropic have announced that Claude 3, their premier language model family, will be available on Google's Vertex AI in a couple of weeks. There is a cost estimator tool there. If you are just using the AI for personal use, say 20 times a day, the cost per month is tiny. They are also offering a US$300 credit as a sweetener. You need to link a Google account. It is all non-Free, and I doubt there is much privacy.
https://www.anthropic.com/news/google-vertex-general-availability
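A rough back-of-the-envelope for the "20 times a day" claim (Python; the per-token prices are the launch list prices for Claude 3 Sonnet and may differ on Vertex, and the token counts per request are pure assumptions):

```python
# Assumed launch list prices for Claude 3 Sonnet (USD per million tokens);
# check the Vertex AI cost estimator for current figures.
price_in_per_mtok = 3.00
price_out_per_mtok = 15.00

requests_per_day = 20          # from the post
tokens_in_per_request = 500    # assumption: short prompt plus some context
tokens_out_per_request = 500   # assumption: medium-length answer
days_per_month = 30

monthly_in = requests_per_day * tokens_in_per_request * days_per_month
monthly_out = requests_per_day * tokens_out_per_request * days_per_month

cost = (monthly_in / 1e6) * price_in_per_mtok + (monthly_out / 1e6) * price_out_per_mtok
print(f"~${cost:.2f} per month")  # roughly $5.40 with these assumptions
```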
-
OpenEvidence, the non-Free medical AI, is now offering its API, for a price. If you give your details, you can try asking it about an ailment:
-
Just in case you guys have missed it, GROK is now OPEN SOURCE!
Grok-1 is a 314B-parameter (that's billions) Mixture-of-Experts model, trained from scratch, and is now open source!
https://twitter.com/elonmusk/status/1767108624038449405
https://www.producthunt.com/posts/grok-1
https://www.producthunt.com/products/grok-ai-assistant
-
@micmc "Open source" in AI is a misnomer, given that the models are compiled.
Still a step in the right direction, but "open source" could give a false sense of security, given that the compute power to compile the models is in the hands of very few.