LLama.Web has no heavy dependencies and no extra frameworks beyond Bootstrap, jQuery, and Mustache, to keep the examples clean and easy to copy into your own project.
Using SignalR (WebSockets) simplifies streaming responses and managing one model per connection.
Models, Prompts and Inference parameters can be added to appsettings.json.
If you would like to add your own local model files, it's best to create an appSettings.Local.json file and add them there. The appSettings.Local.json file will be ignored by Git.
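As a sketch, a minimal appSettings.Local.json might look like the following. The `LLamaOptions` section name and field layout here are assumptions for illustration; mirror the structure of the shipped appsettings.json in your own file:

```json
{
  "LLamaOptions": {
    "Models": [
      {
        "Name": "MyLocalModel",
        "ModelPath": "C:\\Models\\my-model.gguf"
      }
    ]
  }
}
```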
Models
You can add multiple models to the options for quick selection in the UI. Options are based on ModelParams, so they are fully configurable.
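For example, a single model entry might look like this. `ContextSize` and `GpuLayerCount` are properties of LLamaSharp's ModelParams; the values and the file path are placeholders:

```json
{
  "Name": "WizardLM-7B",
  "ModelPath": "D:\\Repositories\\AI\\Models\\wizardLM-7B.gguf",
  "ContextSize": 2048,
  "GpuLayerCount": 20
}
```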
Parameters
You can add multiple sets of inference parameters to the options for quick selection in the UI. Options are based on InferenceParams, so they are fully configurable.
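An inference-parameter preset could look like the following sketch. The property names follow LLamaSharp's InferenceParams, but treat the exact set shown here as illustrative rather than exhaustive:

```json
{
  "Name": "Default",
  "MaxTokens": 512,
  "Temperature": 0.6,
  "TopP": 0.95,
  "RepeatPenalty": 1.1
}
```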
Prompts
You can add multiple sets of prompts to the options for quick selection in the UI.
Example:
```json
{
  "Name": "Alpaca",
  "Path": "D:\\Repositories\\AI\\Prompts\\alpaca.txt",
  "Prompt": "Alternatively you can set the prompt text directly and omit the Path",
  "AntiPrompt": [
    "User:"
  ],
  "OutputFilter": [
    "Response:",
    "User:"
  ]
}
```
The interactive UI is a simple example of using LLamaSharp.
An easy-to-use, high-performance LLM inference framework for C#/.NET, supporting the LLaMA and LLaVA families of models.