AI on Cloudron
-
@LoudLemur It's gonna be a wild ride into the virtual future...
https://x.com/BrianRoemmele/status/1758495953399771300
wrote on Feb 17, 2024, 7:47 AM last edited by
@marcusquinn Hey, thanks for that! This is so uplifting!
-
wrote on Feb 23, 2024, 10:06 PM last edited by
There is a leaderboard for language models here. You can see the context length they support and the required VRAM etc:
https://llm.extractum.io/list/?uncensored=1
There is a review of "uncensored" language models here:
https://scifilogic.com/open-uncensored-llm-model/
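As a rough rule of thumb (my own back-of-the-envelope, not the formula that leaderboard uses), the VRAM figure mostly follows from parameter count and quantization width:

```python
# Back-of-the-envelope VRAM estimate for running an LLM locally:
# weights (params * bits / 8) plus an overhead factor for activations and
# KV cache. A heuristic only -- not the leaderboard's own calculation.

def estimate_vram_gb(params_billion: float, bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    """Approximate GB of VRAM needed to load a model for inference."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

for size_b, bits in [(7, 4), (13, 4), (70, 4), (70, 16)]:
    print(f"{size_b}B @ {bits}-bit: ~{estimate_vram_gb(size_b, bits):.0f} GB")
# e.g. a 70B model at 4-bit still wants ~39 GB, so it won't fit on one 24GB card.
```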
-
wrote on Mar 3, 2024, 12:47 AM last edited by LoudLemur Mar 3, 2024, 4:05 PM
Nvidia release Chat with RTX -
https://www.nvidia.com/en-us/ai-on-rtx/chat-with-rtx-generative-ai/
35GB download
Training Tutorial:
https://vid.puffyan.us/watch?v=y9Gi3UgNA3E&t=79
Which suggests non-Free:
https://customgpt.ai/
(Any Free alternatives?)
Pro-Tip: You should have about 100GB of storage clear. The .zip is 35GB; it needs to unzip to form the installer, which then installs and (slowly) downloads a large number of dependencies. If you don't have the space to spare, the available models won't all be installed properly.
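If you want to sanity-check that before starting, something like this works (the 100GB threshold is just the figure from the tip above, and the path is an example — point it at whatever drive you're installing to):

```python
import shutil

REQUIRED_GB = 100                      # figure from the pro-tip above
install_drive = "C:\\"                 # example path; use your install drive

free_gb = shutil.disk_usage(install_drive).free / 1024**3
print(f"Free space on {install_drive}: {free_gb:.0f} GB")
if free_gb < REQUIRED_GB:
    print(f"Warning: less than {REQUIRED_GB} GB free - the install and model downloads may fail.")
```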
Also, some have been reporting problems when not installing to the default location.
-
wrote on Mar 4, 2024, 4:45 PM last edited by
Claude 3 is now released.
Anthropic's AI is now available in 3 tiers: Opus is its Big Brain, available to paying customers. Sonnet trades brains for speed and powers the tier available gratis. Haiku is the fastest of all.
-
Claude 3 is now released.
Anthropic's AI is now available in 3 tiers: Opus is its Big Brain, available to paying customers. Sonnet trades brains for speed and powers the tier available gratis. Haiku is the fastest of all.
wrote on Mar 5, 2024, 9:58 AM last edited by necrevistonnezr Mar 5, 2024, 9:59 AM
@LoudLemur said in AI on Cloudron:
Claude 3 is now released.
https://www.anthropic.com/news/claude-3-family
Not to the general public. It's "not available in my region" (Germany): https://support.anthropic.com/en/articles/8461763-where-can-i-access-claude-ai
-
wrote on Mar 8, 2024, 11:25 PM last edited by
Finetune / train a 70b language model at home! (on two 24GB GPUs)
https://www.answer.ai/posts/2024-03-06-fsdp-qlora.html
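The article's trick is combining FSDP with QLoRA. As a very rough sketch of just the QLoRA half, using Hugging Face transformers/peft/bitsandbytes (the model id and hyperparameters are placeholders, and the FSDP sharding that actually spreads a 70B model across two 24GB cards is what the Answer.AI code adds on top and isn't shown here):

```python
# Minimal QLoRA sketch: 4-bit quantized base model + small trainable LoRA adapters.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-2-70b-hf"   # placeholder repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                    # store base weights in 4-bit NF4
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Freeze the quantized base and train only low-rank adapter matrices.
lora_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()        # typically well under 1% of the 70B
```
-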
Claude 3 is now released.
Anthropic's AI is now available in 3 tiers: Opus is its Big Brain, available to paying customers. Sonnet trades brains for speed and powers the tier available gratis. Haiku is the fastest of all.
wrote on Mar 12, 2024, 11:38 PM last edited by
@LoudLemur Opus is apparently beating GPT-4 for coding. The larger context window is much needed.
-
@LoudLemur said in AI on Cloudron:
Claude 3 is now released.
https://www.anthropic.com/news/claude-3-family
Not to the general public. It's "not available in my region" (Germany): https://support.anthropic.com/en/articles/8461763-where-can-i-access-claude-ai
wrote on Mar 12, 2024, 11:38 PM last edited by
@necrevistonnezr Maybe try ivpn.net
-
wrote on Mar 12, 2024, 11:39 PM last edited by
Now this is a use for AI. Emotional video...
-
wrote on Mar 13, 2024, 12:18 AM last edited by
Devin - from Cognition AI
https://www.cognition-labs.com/
Over 13% of coding problems solved without assistance, dwarfing its competitors.
https://x.com/cognition_labs/status/1767548763134964000?s=20
-
wrote on Mar 14, 2024, 12:36 AM last edited by
Figure 01 from 'Open'AI
https://x.com/BrianRoemmele/status/1767923602895319525?s=20
Intelligent, seeing robots.
-
wrote on Mar 18, 2024, 12:00 AM last edited by LoudLemur Mar 18, 2024, 12:26 AM
Grok Weights
It is a dataset.
Magnet link, 320GB file:
https://academictorrents.com/details/5f96d43576e3d386c9ba65b883210a393b68210e
-
wrote on Mar 19, 2024, 12:20 PM last edited by
Stability AI have released Stable Video 3D.
It runs on 8GB of VRAM. Low poly.
https://stability.ai/news/introducing-stable-video-3d?utm_source=Twitter&utm_medium=website&utm_campaign=blog
https://huggingface.co/stabilityai/sv3d
-
wrote on Mar 19, 2024, 11:49 PM last edited by
Anthropic announced that Claude 3, their premier language model, will be available on Google's Vertex AI in a couple of weeks. There is a cost estimator tool there. If you are just using the AI for personal use, say 20 times a day, the cost per month is tiny. They are also offering a US$300 credit as a sweetener. You need to link a Google account. It is all non-Free and I doubt there is much privacy.
https://www.anthropic.com/news/google-vertex-general-availability
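For what "tiny" means in practice, a quick back-of-the-envelope (the per-million-token prices and token counts below are my assumptions for illustration; use the estimator for real figures):

```python
# Rough monthly cost for light personal use of Claude 3 (assumed figures).
REQUESTS_PER_DAY = 20
INPUT_TOKENS = 500          # assumed tokens per request
OUTPUT_TOKENS = 500         # assumed tokens per response

# assumed USD per million tokens: (input, output)
PRICING = {"Opus": (15.00, 75.00), "Sonnet": (3.00, 15.00), "Haiku": (0.25, 1.25)}

for tier, (in_price, out_price) in PRICING.items():
    monthly = REQUESTS_PER_DAY * 30 * (
        INPUT_TOKENS * in_price + OUTPUT_TOKENS * out_price
    ) / 1_000_000
    print(f"{tier}: ~${monthly:.2f}/month")
# Sonnet and Haiku come out at single-digit dollars (or cents) per month here;
# Opus is more, but still modest for 20 requests a day.
```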
-
wrote on Mar 25, 2024, 10:15 AM last edited by
OpenEvidence, the non-Free medical AI, is now offering its API for a price. If you give your details, you can try asking it about an ailment.
-
wrote on Mar 25, 2024, 5:25 PM last edited by
Just in case you guys have missed it, GROK is now OPEN SOURCE!
Grok-1 is a 314B (that's billions of) parameter Mixture-of-Experts model, trained from scratch, and is now open source!
https://twitter.com/elonmusk/status/1767108624038449405
https://www.producthunt.com/posts/grok-1
https://www.producthunt.com/products/grok-ai-assistant
-
wrote on Mar 25, 2024, 6:22 PM last edited by
-
Just in case you guys have missed it, GROK is now OPEN SOURCE!
Grok-1 is a 314B (that's billions of) parameter Mixture-of-Experts model, trained from scratch, and is now open source!
https://twitter.com/elonmusk/status/1767108624038449405
https://www.producthunt.com/posts/grok-1
https://www.producthunt.com/products/grok-ai-assistant
wrote on Mar 25, 2024, 8:29 PM last edited by marcusquinn Mar 25, 2024, 8:29 PM
@micmc Open-source in AIs is a misnomer, given the models are compiled.
Still the right direction, but open source could give a false sense of security, given the compute power to compile the models is in the hands of very few.
-
@micmc Open-source in AIs is a misnomer, given the models are compiled.
Still the right direction, but open source could give a false sense of security, given the compute power to compile the models is in the hands of very few.
wrote on Mar 25, 2024, 8:45 PM last edited by
@marcusquinn said in AI on Cloudron:
@micmc Open-source in AIs is a misnomer, given the models are compiled.
Still the right direction, but open source could give a false sense of security, given the compute power to compile the models is in the hands of very few.
Yeah, I sure get it, and in that sense you're right. I think the idea of open source in language models is that you're able to download, privately host, and train the model with your own data, to which, theoretically speaking, no one else has access. A binary whose code we cannot examine can't really be called 'open source' in the real sense of the term.
-
wrote on Mar 26, 2024, 2:24 PM last edited by
Rabbit
https://rabbit.tech
AI Operating System device