
LLM Farm
Artem Savkin
4.3
Release Details
Country of release | US |
Country release date | 2023-12-13 |
Category | Developer Tools |
Country/Region | US |
Developer website | Artem Savkin |
Support URL | Artem Savkin |
Content rating | 4+ |
LLMFarm is an iOS and macOS app for working with large language models (LLMs). It allows you to load different LLMs with specific parameters.
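As a rough illustration of what such per-model parameters look like, here is a minimal sketch of a model settings template in Swift. The type and field names below are assumptions made for illustration; they are not LLMFarm's actual settings schema.

```swift
import Foundation

// Hypothetical per-model settings template. Field names are assumptions made
// for illustration only; they are not LLMFarm's actual configuration keys.
struct ModelSettingsTemplate: Codable {
    var modelPath: String      // local path to a GGUF/GGML model file
    var promptFormat: String   // prompt wrapper, e.g. "[INST] {{prompt}} [/INST]"
    var contextLength: Int     // context window in tokens
    var useMetal: Bool         // offload computation to the GPU via Metal
    var temperature: Double    // sampling temperature
    var topK: Int              // top-k cutoff
    var topP: Double           // top-p (nucleus) cutoff
}

// Example template for a LLaMA-style chat model (values are illustrative).
let llamaChat = ModelSettingsTemplate(
    modelPath: "models/llama-2-7b-chat.Q4_K_M.gguf",
    promptFormat: "[INST] {{prompt}} [/INST]",
    contextLength: 2048,
    useMetal: true,
    temperature: 0.8,
    topK: 40,
    topP: 0.95
)

// A template like this can be serialized to JSON and reused across models.
if let json = try? JSONEncoder().encode(llamaChat),
   let text = String(data: json, encoding: .utf8) {
    print(text)
}
```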
# Features
* Various inferences
* Various sampling methods
* Metal
* Model setting templates
* LoRA adapters support
* LoRA FineTune and Export
# Inferences
* LLaMA
* GPTNeoX
* Replit
* GPT2 + Cerebras
* Starcoder (Santacoder)
* RWKV (20B tokenizer)
* Falcon
* MPT
* Bloom
* StableLM-3b-4e1t
* Qwen
* Gemma
* Phi
* Mamba
* Others
# Multimodal
* LLaVA 1.5 models
* Obsidian
* MobileVLM 1.7B/3B models
Note: For Falcon, Alpaca, GPT4All, Chinese LLaMA / Alpaca and Chinese LLaMA-2 / Alpaca-2, Vigogne (French), Vicuna, Koala, OpenBuddy (Multilingual), Pygmalion/Metharme, WizardLM, Baichuan 1 & 2 + derivations, Aquila 1 & 2, Mistral AI v0.1, Refact, Persimmon 8B, MPT, and Bloom, select the llama inference in model settings.
# Sampling methods
* Temperature (temp, top-k, top-p), illustrated in the sketch after this list
* Tail Free Sampling (TFS)
* Locally Typical Sampling
* Mirostat
* Greedy
* Grammar (does not work with GGJTv3)
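To make the list above concrete, here is a minimal, self-contained sketch of how a temperature / top-k / top-p chain is commonly applied to a model's output logits before a token is drawn, with greedy decoding as the zero-temperature case. This illustrates the general technique only and is not LLMFarm's implementation; TFS, locally typical sampling, Mirostat, and grammar-constrained sampling are omitted.

```swift
import Foundation

// Minimal illustration of a temperature -> top-k -> top-p sampling chain.
// Not LLMFarm's code; just the general technique applied to raw logits.
func sampleToken(logits: [Double], temperature: Double, topK: Int, topP: Double) -> Int {
    // Greedy decoding is the zero-temperature case: just take the argmax.
    if temperature <= 0 {
        return logits.indices.max(by: { logits[$0] < logits[$1] })!
    }

    // 1. Temperature: scale the logits, then softmax into probabilities.
    let scaled = logits.map { $0 / temperature }
    let maxLogit = scaled.max()!
    let exps = scaled.map { exp($0 - maxLogit) }
    let total = exps.reduce(0, +)
    let probs = exps.map { $0 / total }

    // 2. Top-k: keep only the k most probable tokens.
    var candidates = probs.indices.sorted { probs[$0] > probs[$1] }
    if topK > 0 && topK < candidates.count {
        candidates = Array(candidates.prefix(topK))
    }

    // 3. Top-p (nucleus): keep the smallest prefix whose mass reaches topP.
    var kept: [Int] = []
    var cumulative = 0.0
    for idx in candidates {
        kept.append(idx)
        cumulative += probs[idx]
        if cumulative >= topP { break }
    }

    // 4. Renormalize over the surviving tokens and draw one at random.
    let mass = kept.map { probs[$0] }.reduce(0, +)
    var r = Double.random(in: 0..<mass)
    for idx in kept {
        r -= probs[idx]
        if r <= 0 { return idx }
    }
    return kept.last!
}

// Toy usage: a 5-token "vocabulary" with made-up logits.
let logits = [2.0, 1.0, 0.5, -1.0, -3.0]
print(sampleToken(logits: logits, temperature: 0.8, topK: 3, topP: 0.95))
```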
Download LLM Farm

Not Available
Average Rating
24
Rating Breakdown
Featured Reviews
By hackerlocal.us
2024-06-04
Version
The app needs Markdown rendering for LLM responses. Often the LLM will output markdown, but the app doesn't parse it. Other than that, this is the best LLM app for mobile! It has a lot of customizable settings and it supports a lot of quantized models. It also supports all the shortcuts. What other open source LLM app can compete with this? Its developer is active and constantly trying to improve the app. I love this app.
By takemusuaiki
2024-04-06
Version
Love this app. Surprisingly powerful, with tons of tweaking options that many apps on full computers lack. I use it on my phone and iPad. I would love to see support for newer Gemma models and the ability to act as an inference server like ollama or LM Studio, so I can run local inference for Obsidian or other apps.
By Feeling defrauded
2024-02-21
Version
This app looks pretty promising, but it's a little bit daunting to someone who's not as familiar with setting up LLMs. For example, how do you download the LLMs and where do you go to get them? Which ones are likely to work? It might be a good idea to list which specific LLMs have been tested on which devices. Some sort of tutorial or instructions would be really useful.
Screenshots



