Compare commits

...

48 Commits

Author SHA1 Message Date
Patrick Devine
a0ae700d5d add llama2:13b model to the readme 2023-07-19 08:15:56 -07:00
Michael Chiang
0294216ea9 Merge pull request #124 from DavidZirinsky/patch-1
Update README.md
2023-07-19 07:40:24 -07:00
Eva Ho
f08c050e57 fix page transitions flickering 2023-07-19 10:19:24 -04:00
DavidZirinsky
ffcd90e8a7 Update README.md
I needed to do this to run the project
2023-07-19 08:14:44 -06:00
Jeffrey Morgan
4ca7c4be1f dont consume reader when calculating digest 2023-07-19 00:47:55 -07:00
Michael Chiang
17b7af78f0 Merge pull request #115 from jmorganca/Add-wizard-vicuna-uncensored-model-link
Add wizard vicuna uncensored model link
2023-07-18 22:58:07 -07:00
Jeffrey Morgan
4c1dc52083 app: create /usr/local/bin/ if it does not exist 2023-07-18 22:50:52 -07:00
Patrick Devine
572fc9099f add license layers to the parser (#116) 2023-07-18 22:49:38 -07:00
Michael Chiang
3020f29041 Add wizard vicuna uncensored model link 2023-07-18 22:19:12 -07:00
Michael Yang
a6d03dd510 Merge pull request #110 from jmorganca/fix-pull-0-bytes
fix pull 0 bytes on completed layer
2023-07-18 19:38:59 -07:00
Michael Yang
68df36ae50 fix pull 0 bytes on completed layer 2023-07-18 19:38:11 -07:00
Michael Yang
5540305293 Merge pull request #112 from jmorganca/fix-relative-modelfile
resolve modelfile before passing to server
2023-07-18 19:36:24 -07:00
Michael Yang
d4cfee79d5 resolve modelfile before passing to server 2023-07-18 19:34:05 -07:00
Michael Yang
6e36f948df Merge pull request #109 from jmorganca/fix-create-memory
fix memory leak in create
2023-07-18 17:25:19 -07:00
Michael Yang
553fa39fe8 fix memory leak in create 2023-07-18 17:14:17 -07:00
Jeffrey Morgan
820e581ad8 web: fix typos and add link to discord 2023-07-18 17:03:40 -07:00
Isaac McFadyen
d14785738e README typo fix (#106)
* Fixed typo in README
2023-07-18 16:24:57 -07:00
Patrick Devine
9e15635c2d attempt two for skipping files in the file walk (#105) 2023-07-18 15:37:01 -07:00
Jeffrey Morgan
3e10f902f5 add mario example 2023-07-18 14:27:36 -07:00
Jeffrey Morgan
aa6714f25c fix typo in README.md 2023-07-18 14:03:11 -07:00
Jeffrey Morgan
7f3a37aed4 fix typo 2023-07-18 13:32:06 -07:00
Jeffrey Morgan
7b08280355 move download to the top of README.md 2023-07-18 13:31:25 -07:00
Jeffrey Morgan
e3cc4d5eac update README.md with new syntax 2023-07-18 13:22:46 -07:00
Jeffrey Morgan
8c85dfb735 Add README.md for examples 2023-07-18 13:22:46 -07:00
hoyyeva
ac62a413e5 Merge pull request #103 from jmorganca/web-update
website content and design update
2023-07-18 16:18:04 -04:00
Eva Ho
d1f89778e9 fix css on smaller screen 2023-07-18 16:17:42 -04:00
Eva Ho
df67a90e64 fix css 2023-07-18 16:02:45 -04:00
Eva Ho
576ae644de enable downloader 2023-07-18 15:57:39 -04:00
Eva Ho
7e52e51db1 update website text and design 2023-07-18 15:56:43 -04:00
Michael Chiang
f12df8d79a Merge pull request #101 from jmorganca/adding-logo
add logo
2023-07-18 12:47:20 -07:00
Michael Chiang
65de730bdb Update README.md
add logo
2023-07-18 12:45:38 -07:00
Patrick Devine
9658a5043b skip files in the list if we can't get the correct model path (#100) 2023-07-18 12:39:08 -07:00
Jeffrey Morgan
280fbe8019 app: use llama2 instead of orca 2023-07-18 12:36:03 -07:00
Jeffrey Morgan
2e339c2bab flatten examples 2023-07-18 12:33:50 -07:00
Michael Yang
38f0c54c64 Merge pull request #99 from jmorganca/mkdir-blobs
fix mkdir blob path
2023-07-18 11:29:05 -07:00
Michael Yang
f20426a768 fix mkdir blob path 2023-07-18 11:24:19 -07:00
Michael Yang
885f67a471 Merge pull request #92 from jmorganca/create-model-spinner
Create model spinner
2023-07-18 11:15:45 -07:00
Eva Ho
a9cc270b4d icon update 2023-07-18 13:33:26 -04:00
Eva Ho
aa281a30e5 updating icons 2023-07-18 13:33:26 -04:00
Matt Williams
760bc3366b Merge pull request #98 from jmorganca/matt/modelfiledoc
First stab at a modelfile doc
2023-07-18 09:16:01 -07:00
Patrick Devine
5bea29f610 add new list command (#97) 2023-07-18 09:09:45 -07:00
Matt Williams
9310ee3967 First stab at a modelfile doc
Signed-off-by: Matt Williams <m@technovangelist.com>
2023-07-18 08:22:17 -07:00
Matt Williams
da7ddbb4dc Merge pull request #95 from jmorganca/matt/examplemodelfiles 2023-07-18 05:32:38 -07:00
Patrick Devine
4a28a2f093 add modelpaths (#96) 2023-07-17 22:44:21 -07:00
Matt Williams
3d9498dc95 Some simple modelfile examples
Signed-off-by: Matt Williams <m@technovangelist.com>
2023-07-17 17:16:59 -07:00
Jeffrey Morgan
1f45f7bb52 convert commands to uppercase in parser 2023-07-17 15:34:08 -07:00
Michael Yang
e4300e1eb7 add spinner to create 2023-07-17 14:15:42 -07:00
Michael Yang
aba706ea2d remove unused persistent pre run 2023-07-17 14:14:57 -07:00
32 changed files with 1050 additions and 387 deletions

113
README.md
View File

@@ -1,75 +1,68 @@
![ollama](https://github.com/jmorganca/ollama/assets/251292/961f99bb-251a-4eec-897d-1ba99997ad0f)
<div align="center">
<picture>
<source media="(prefers-color-scheme: dark)" height="200px" srcset="https://github.com/jmorganca/ollama/assets/3325447/318048d2-b2dd-459c-925a-ac8449d5f02c">
<img alt="logo" height="200px" src="https://github.com/jmorganca/ollama/assets/3325447/c7d6e15f-7f4d-4776-b568-c084afa297c2">
</picture>
</div>
# Ollama
Run large language models with `llama.cpp`.
Create, run, and share self-contained large language models (LLMs). Ollama bundles a model's weights, configuration, prompts, and more into self-contained packages that run anywhere.
> Note: certain models that can be run with Ollama are intended for research and/or non-commercial use only.
> Note: Ollama is in early preview. Please report any issues you find.
### Features
## Download
- Download and run popular large language models
- Switch between multiple models on the fly
- Hardware acceleration where available (Metal, CUDA)
- Fast inference server written in Go, powered by [llama.cpp](https://github.com/ggerganov/llama.cpp)
- REST API to use with your application (python, typescript SDKs coming soon)
- [Download](https://ollama.ai/download) for macOS on Apple Silicon (Intel coming soon)
- Download for Windows and Linux (coming soon)
- Build [from source](#building)
## Install
## Examples
- [Download](https://ollama.ai/download) for macOS with Apple Silicon (Intel coming soon)
- Download for Windows (coming soon)
You can also build the [binary from source](#building).
## Quickstart
Run a fast and simple model.
### Quickstart
```
ollama run orca
ollama run llama2
>>> hi
Hello! How can I help you today?
```
## Example models
### Creating a custom model
### 💬 Chat
Have a conversation.
Create a `Modelfile`:
```
ollama run vicuna "Why is the sky blue?"
FROM llama2
PROMPT """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
User: {{ .Prompt }}
Mario:
"""
```
### 🗺️ Instructions
Get a helping hand.
Next, create and run the model:
```
ollama run orca "Write an email to my boss."
ollama create mario -f ./Modelfile
ollama run mario
>>> hi
Hello! It's your friend Mario.
```
### 🔎 Ask questions about documents
## Model library
Send the contents of a document and ask questions about it.
Ollama includes a library of open-source, pre-trained models. More models are coming soon. You should have at least 8 GB of RAM to run the 3B models, 16 GB
to run the 7B models, and 32 GB to run the 13B models.
```
ollama run nous-hermes "$(cat input.txt)", please summarize this story
```
### 📖 Storytelling
Venture into the unknown.
```
ollama run nous-hermes "Once upon a time"
```
## Advanced usage
### Run a local model
```
ollama run ~/Downloads/vicuna-7b-v1.3.ggmlv3.q4_1.bin
```
| Model | Parameters | Size | Download |
| ---------------------- | ---------- | ----- | --------------------------- |
| Llama2 | 7B | 3.8GB | `ollama pull llama2` |
| Llama2 13B | 13B | 7.3GB | `ollama pull llama2:13b` |
| Orca Mini | 3B | 1.9GB | `ollama pull orca` |
| Vicuna | 7B | 3.8GB | `ollama pull vicuna` |
| Nous-Hermes | 13B | 7.3GB | `ollama pull nous-hermes` |
| Wizard Vicuna Uncensored | 13B | 7.3GB | `ollama pull wizard-vicuna` |
## Building
@@ -80,29 +73,11 @@ go build .
To run it start the server:
```
./ollama server &
./ollama serve &
```
Finally, run a model!
```
./ollama run ~/Downloads/vicuna-7b-v1.3.ggmlv3.q4_1.bin
```
## API Reference
### `POST /api/pull`
Download a model
```
curl -X POST http://localhost:11343/api/pull -d '{"model": "orca"}'
```
### `POST /api/generate`
Complete a prompt
```
curl -X POST http://localhost:11434/api/generate -d '{"model": "orca", "prompt": "hello!"}'
./ollama run llama2
```
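The API reference removed from the README here is still served by the Go server (see the routes registered in `server/routes.go` at the end of this diff). Below is a minimal, hedged sketch of calling `/api/generate` directly, assuming the server is running on the default port 11434 and that responses stream as newline-delimited JSON objects with a `response` field, as the removed Python example relied on:

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Request body matches api.GenerateRequest: a model name and a prompt.
	body, _ := json.Marshal(map[string]string{
		"model":  "llama2",
		"prompt": "Why is the sky blue?",
	})

	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Each line of the streamed body is assumed to be one JSON object
	// containing a partial "response" string.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		var chunk struct {
			Response string `json:"response"`
		}
		if err := json.Unmarshal(scanner.Bytes(), &chunk); err != nil {
			log.Fatal(err)
		}
		fmt.Print(chunk.Response)
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
	fmt.Println()
}
```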

View File

@@ -6,26 +6,31 @@ import (
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"net/url"
)
type StatusError struct {
StatusCode int
Status string
Message string
type Client struct {
base url.URL
HTTP http.Client
Headers http.Header
}
func (e StatusError) Error() string {
if e.Message != "" {
return fmt.Sprintf("%s: %s", e.Status, e.Message)
func checkError(resp *http.Response, body []byte) error {
if resp.StatusCode >= 200 && resp.StatusCode < 400 {
return nil
}
return e.Status
}
apiError := StatusError{StatusCode: resp.StatusCode}
type Client struct {
base url.URL
err := json.Unmarshal(body, &apiError)
if err != nil {
// Use the full body as the message if we fail to decode a response.
apiError.Message = string(body)
}
return apiError
}
func NewClient(hosts ...string) *Client {
@@ -36,9 +41,60 @@ func NewClient(hosts ...string) *Client {
return &Client{
base: url.URL{Scheme: "http", Host: host},
HTTP: http.Client{},
}
}
func (c *Client) do(ctx context.Context, method, path string, reqData, respData any) error {
var reqBody io.Reader
var data []byte
var err error
if reqData != nil {
data, err = json.Marshal(reqData)
if err != nil {
return err
}
reqBody = bytes.NewReader(data)
}
url := c.base.JoinPath(path).String()
req, err := http.NewRequestWithContext(ctx, method, url, reqBody)
if err != nil {
return err
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("Accept", "application/json")
for k, v := range c.Headers {
req.Header[k] = v
}
respObj, err := c.HTTP.Do(req)
if err != nil {
return err
}
defer respObj.Body.Close()
respBody, err := io.ReadAll(respObj.Body)
if err != nil {
return err
}
if err := checkError(respObj, respBody); err != nil {
return err
}
if len(respBody) > 0 && respData != nil {
if err := json.Unmarshal(respBody, respData); err != nil {
return err
}
}
return nil
}
func (c *Client) stream(ctx context.Context, method, path string, data any, fn func([]byte) error) error {
var buf *bytes.Buffer
if data != nil {
@@ -104,11 +160,11 @@ func (c *Client) Generate(ctx context.Context, req *GenerateRequest, fn Generate
})
}
type PullProgressFunc func(PullProgress) error
type PullProgressFunc func(ProgressResponse) error
func (c *Client) Pull(ctx context.Context, req *PullRequest, fn PullProgressFunc) error {
return c.stream(ctx, http.MethodPost, "/api/pull", req, func(bts []byte) error {
var resp PullProgress
var resp ProgressResponse
if err := json.Unmarshal(bts, &resp); err != nil {
return err
}
@@ -117,11 +173,11 @@ func (c *Client) Pull(ctx context.Context, req *PullRequest, fn PullProgressFunc
})
}
type PushProgressFunc func(PushProgress) error
type PushProgressFunc func(ProgressResponse) error
func (c *Client) Push(ctx context.Context, req *PushRequest, fn PushProgressFunc) error {
return c.stream(ctx, http.MethodPost, "/api/push", req, func(bts []byte) error {
var resp PushProgress
var resp ProgressResponse
if err := json.Unmarshal(bts, &resp); err != nil {
return err
}
@@ -142,3 +198,11 @@ func (c *Client) Create(ctx context.Context, req *CreateRequest, fn CreateProgre
return fn(resp)
})
}
func (c *Client) List(ctx context.Context) (*ListResponse, error) {
var lr ListResponse
if err := c.do(ctx, http.MethodGet, "/api/tags", nil, &lr); err != nil {
return nil, err
}
return &lr, nil
}
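Taken together, the new `checkError`/`do` helpers and the `List` method make the client usable for plain request/response calls alongside the existing streaming ones. A small sketch of how a caller might drive the updated client; only types and methods visible in this diff are used:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/jmorganca/ollama/api"
)

func main() {
	client := api.NewClient()

	// Stream a pull; each update arrives as an api.ProgressResponse.
	err := client.Pull(context.Background(), &api.PullRequest{Name: "llama2"}, func(p api.ProgressResponse) error {
		fmt.Println(p.Status)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}

	// List local models via the new non-streaming GET /api/tags helper.
	models, err := client.List(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	for _, m := range models.Models {
		fmt.Println(m.Name, m.Size, m.ModifiedAt)
	}
}
```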

View File

@@ -7,6 +7,19 @@ import (
"time"
)
type StatusError struct {
StatusCode int
Status string
Message string
}
func (e StatusError) Error() string {
if e.Message != "" {
return fmt.Sprintf("%s: %s", e.Status, e.Message)
}
return e.Status
}
type GenerateRequest struct {
Model string `json:"model"`
Prompt string `json:"prompt"`
@@ -30,12 +43,11 @@ type PullRequest struct {
Password string `json:"password"`
}
type PullProgress struct {
type ProgressResponse struct {
Status string `json:"status"`
Digest string `json:"digest,omitempty"`
Total int `json:"total,omitempty"`
Completed int `json:"completed,omitempty"`
Percent float64 `json:"percent,omitempty"`
}
type PushRequest struct {
@@ -44,12 +56,14 @@ type PushRequest struct {
Password string `json:"password"`
}
type PushProgress struct {
Status string `json:"status"`
Digest string `json:"digest,omitempty"`
Total int `json:"total,omitempty"`
Completed int `json:"completed,omitempty"`
Percent float64 `json:"percent,omitempty"`
type ListResponse struct {
Models []ListResponseModel `json:"models"`
}
type ListResponseModel struct {
Name string `json:"name"`
ModifiedAt time.Time `json:"modified_at"`
Size int `json:"size"`
}
type GenerateResponse struct {

View File

Binary file not shown. (Before: 442 B, After: 403 B)

View File

Binary file not shown. (Before: 889 B, After: 741 B)

View File

@@ -1,4 +1,4 @@
import type { ForgeConfig, ResolvedForgeConfig, ForgeMakeResult } from '@electron-forge/shared-types'
import type { ForgeConfig } from '@electron-forge/shared-types'
import { MakerSquirrel } from '@electron-forge/maker-squirrel'
import { MakerZIP } from '@electron-forge/maker-zip'
import { PublisherGithub } from '@electron-forge/publisher-github'

View File

@@ -19,7 +19,7 @@ export default function () {
const [step, setStep] = useState<Step>(Step.WELCOME)
const [commandCopied, setCommandCopied] = useState<boolean>(false)
const command = 'ollama run orca'
const command = 'ollama run llama2'
return (
<div className='drag'>
@@ -77,7 +77,11 @@ export default function () {
{command}
</pre>
<button
className={`no-drag absolute right-[5px] px-2 py-2 ${commandCopied ? 'text-gray-900 opacity-100 hover:cursor-auto' : 'text-gray-200 opacity-50 hover:cursor-pointer'} hover:text-gray-900 hover:font-bold group-hover:opacity-100`}
className={`no-drag absolute right-[5px] px-2 py-2 ${
commandCopied
? 'text-gray-900 opacity-100 hover:cursor-auto'
: 'text-gray-200 opacity-50 hover:cursor-pointer'
} hover:font-bold hover:text-gray-900 group-hover:opacity-100`}
onClick={() => {
copy(command)
setCommandCopied(true)
@@ -85,13 +89,15 @@ export default function () {
}}
>
{commandCopied ? (
<CheckIcon className='h-4 w-4 text-gray-500 font-bold' />
<CheckIcon className='h-4 w-4 font-bold text-gray-500' />
) : (
<DocumentDuplicateIcon className='h-4 w-4 text-gray-500' />
)}
</button>
</div>
<p className='mx-auto my-4 w-[70%] text-xs text-gray-400'>Run this command in your favorite terminal.</p>
<p className='mx-auto my-4 w-[70%] text-xs text-gray-400'>
Run this command in your favorite terminal.
</p>
</div>
<button
onClick={() => {

View File

@@ -162,7 +162,6 @@ app.on('ready', () => {
// This is the first run or the CLI is no longer installed
app.setLoginItemSettings({ openAtLogin: true })
firstRunWindow()
})

View File

@@ -13,7 +13,9 @@ export function installed() {
}
export async function install() {
const command = `do shell script "ln -F -s ${ollama} ${symlinkPath}" with administrator privileges`
const command = `do shell script "mkdir -p ${path.dirname(
symlinkPath
)} && ln -F -s ${ollama} ${symlinkPath}" with administrator privileges`
try {
await exec(`osascript -e '${command}'`)

View File

@@ -13,30 +13,37 @@ import (
"strings"
"time"
"github.com/dustin/go-humanize"
"github.com/olekukonko/tablewriter"
"github.com/schollz/progressbar/v3"
"github.com/spf13/cobra"
"golang.org/x/term"
"github.com/jmorganca/ollama/api"
"github.com/jmorganca/ollama/format"
"github.com/jmorganca/ollama/server"
)
func cacheDir() string {
home, err := os.UserHomeDir()
if err != nil {
panic(err)
}
return filepath.Join(home, ".ollama")
}
func create(cmd *cobra.Command, args []string) error {
filename, _ := cmd.Flags().GetString("file")
filename, err := filepath.Abs(filename)
if err != nil {
return err
}
client := api.NewClient()
var spinner *Spinner
request := api.CreateRequest{Name: args[0], Path: filename}
fn := func(resp api.CreateProgress) error {
fmt.Println(resp.Status)
if spinner != nil {
spinner.Stop()
}
spinner = NewSpinner(resp.Status)
go spinner.Spin(100 * time.Millisecond)
return nil
}
@@ -44,11 +51,21 @@ func create(cmd *cobra.Command, args []string) error {
return err
}
if spinner != nil {
spinner.Stop()
}
return nil
}
func RunRun(cmd *cobra.Command, args []string) error {
_, err := os.Stat(args[0])
mp := server.ParseModelPath(args[0])
fp, err := mp.GetManifestPath(false)
if err != nil {
return err
}
_, err = os.Stat(fp)
switch {
case errors.Is(err, os.ErrNotExist):
if err := pull(args[0]); err != nil {
@@ -72,7 +89,7 @@ func push(cmd *cobra.Command, args []string) error {
client := api.NewClient()
request := api.PushRequest{Name: args[0]}
fn := func(resp api.PushProgress) error {
fn := func(resp api.ProgressResponse) error {
fmt.Println(resp.Status)
return nil
}
@@ -83,6 +100,34 @@ func push(cmd *cobra.Command, args []string) error {
return nil
}
func list(cmd *cobra.Command, args []string) error {
client := api.NewClient()
models, err := client.List(context.Background())
if err != nil {
return err
}
var data [][]string
for _, m := range models.Models {
data = append(data, []string{m.Name, humanize.Bytes(uint64(m.Size)), format.HumanTime(m.ModifiedAt, "Never")})
}
table := tablewriter.NewWriter(os.Stdout)
table.SetHeader([]string{"NAME", "SIZE", "MODIFIED"})
table.SetHeaderAlignment(tablewriter.ALIGN_LEFT)
table.SetAlignment(tablewriter.ALIGN_LEFT)
table.SetHeaderLine(false)
table.SetBorder(false)
table.SetNoWhiteSpace(true)
table.SetTablePadding("\t")
table.AppendBulk(data)
table.Render()
return nil
}
func RunPull(cmd *cobra.Command, args []string) error {
return pull(args[0])
}
@@ -90,25 +135,23 @@ func RunPull(cmd *cobra.Command, args []string) error {
func pull(model string) error {
client := api.NewClient()
var currentDigest string
var bar *progressbar.ProgressBar
currentLayer := ""
request := api.PullRequest{Name: model}
fn := func(resp api.PullProgress) error {
if resp.Digest != currentLayer && resp.Digest != "" {
if currentLayer != "" {
fmt.Println()
}
currentLayer = resp.Digest
layerStr := resp.Digest[7:23] + "..."
fn := func(resp api.ProgressResponse) error {
if resp.Digest != currentDigest && resp.Digest != "" {
currentDigest = resp.Digest
bar = progressbar.DefaultBytes(
int64(resp.Total),
"pulling "+layerStr,
fmt.Sprintf("pulling %s...", resp.Digest[7:19]),
)
} else if resp.Digest == currentLayer && resp.Digest != "" {
bar.Set(resp.Completed)
} else if resp.Digest == currentDigest && resp.Digest != "" {
bar.Set(resp.Completed)
} else {
currentLayer = ""
currentDigest = ""
fmt.Println(resp.Status)
}
return nil
@@ -139,24 +182,8 @@ func generate(cmd *cobra.Command, model, prompt string) error {
if len(strings.TrimSpace(prompt)) > 0 {
client := api.NewClient()
spinner := progressbar.NewOptions(-1,
progressbar.OptionSetWriter(os.Stderr),
progressbar.OptionThrottle(60*time.Millisecond),
progressbar.OptionSpinnerType(14),
progressbar.OptionSetRenderBlankState(true),
progressbar.OptionSetElapsedTime(false),
progressbar.OptionClearOnFinish(),
)
go func() {
for range time.Tick(60 * time.Millisecond) {
if spinner.IsFinished() {
break
}
spinner.Add(1)
}
}()
spinner := NewSpinner("")
go spinner.Spin(60 * time.Millisecond)
var latest api.GenerateResponse
@@ -255,10 +282,6 @@ func NewCLI() *cobra.Command {
CompletionOptions: cobra.CompletionOptions{
DisableDefaultCmd: true,
},
PersistentPreRunE: func(_ *cobra.Command, args []string) error {
// create the models directory and it's parent
return os.MkdirAll(filepath.Join(cacheDir(), "models"), 0o700)
},
}
cobra.EnableCommandSorting = false
@@ -302,12 +325,19 @@ func NewCLI() *cobra.Command {
RunE: push,
}
listCmd := &cobra.Command{
Use: "list",
Short: "List models",
RunE: list,
}
rootCmd.AddCommand(
serveCmd,
createCmd,
runCmd,
pullCmd,
pushCmd,
listCmd,
)
return rootCmd

44
cmd/spinner.go Normal file
View File

@@ -0,0 +1,44 @@
package cmd
import (
"fmt"
"os"
"time"
"github.com/schollz/progressbar/v3"
)
type Spinner struct {
description string
*progressbar.ProgressBar
}
func NewSpinner(description string) *Spinner {
return &Spinner{
description: description,
ProgressBar: progressbar.NewOptions(-1,
progressbar.OptionSetWriter(os.Stderr),
progressbar.OptionThrottle(60*time.Millisecond),
progressbar.OptionSpinnerType(14),
progressbar.OptionSetRenderBlankState(true),
progressbar.OptionSetElapsedTime(false),
progressbar.OptionClearOnFinish(),
progressbar.OptionSetDescription(description),
),
}
}
func (s *Spinner) Spin(tick time.Duration) {
for range time.Tick(tick) {
if s.IsFinished() {
break
}
s.Add(1)
}
}
func (s *Spinner) Stop() {
s.Finish()
fmt.Println(s.description)
}
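The spinner extracted into this new file wraps `progressbar` and is what the reworked `create` and `generate` commands start and stop around long-running work. A tiny usage sketch, mirroring the start/stop sequence in the `create` function above:

```go
package main

import (
	"time"

	"github.com/jmorganca/ollama/cmd"
)

func main() {
	spinner := cmd.NewSpinner("creating model layer...")
	go spinner.Spin(100 * time.Millisecond)

	time.Sleep(2 * time.Second) // stand-in for real work
	spinner.Stop()              // finishes the bar and prints the description
}
```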

View File

@@ -3,13 +3,13 @@
Install required tools:
```
brew install cmake go node
brew install go
```
Then run `make`:
Then build ollama:
```
make
go build .
```
Now you can run `ollama`:

80
docs/modelfile.md Normal file
View File

@@ -0,0 +1,80 @@
# Ollama Model File Reference
Ollama can build models automatically by reading the instructions from a Modelfile. A Modelfile is a text document that represents the complete configuration of the Model. You can see that a Modelfile is very similar to a Dockerfile.
## Format
Here is the format of the Modelfile:
```modelfile
# comment
INSTRUCTION arguments
```
Nothing in the file is case-sensitive. However, the convention is for instructions to be uppercase to make it easier to distinguish from the arguments.
A Modelfile can include instructions in any order. But the convention is to start the Modelfile with the FROM instruction.
Although the example above shows a comment starting with a hash character, any instruction that is not recognized is seen as a comment.
## FROM
```modelfile
FROM <image>[:<tag>]
```
This defines the base model to be used. An image can be a known image on the Ollama Hub, or a fully-qualified path to a model file on your system
## PARAMETER
The PARAMETER instruction defines a parameter that can be set when the model is run.
```modelfile
PARAMETER <parameter> <parametervalue>
```
### Valid Parameters and Values
| Parameter | Description | Value Type | Value Range |
| ---------------- | ------------------------------------------------------------------------------------------- | ---------- | ----------- |
| NumCtx | | int | |
| NumGPU | | int | |
| MainGPU | | int | |
| LowVRAM | | bool | |
| F16KV | | bool | |
| LogitsAll | | bool | |
| VocabOnly | | bool | |
| UseMMap | | bool | |
| EmbeddingOnly | | bool | |
| RepeatLastN | | int | |
| RepeatPenalty | | float | |
| FrequencyPenalty | | float | |
| PresencePenalty | | float | |
| temperature | The temperature of the model. Higher temperatures result in more creativity in the response | float | 0 - 1 |
| TopK | | int | |
| TopP | | float | |
| TFSZ | | float | |
| TypicalP | | float | |
| Mirostat | | int | |
| MirostatTau | | float | |
| MirostatEta | | float | |
| NumThread | | int | |
## PROMPT
Prompt is a multiline instruction that defines the prompt to be used when the model is run. Typically there are 3-4 components to a prompt: System, context, user, and response.
```modelfile
PROMPT """
{{- if not .Context }}
### System:
You are a content marketer who needs to come up with a short but succinct tweet. Make sure to include the appropriate hashtags and links. Sometimes when appropriate, describe a meme that can be includes as well. All answers should be in the form of a tweet which has a max size of 280 characters. Every instruction will be the topic to create a tweet about.
{{- end }}
### Instruction:
{{ .Prompt }}
### Response:
"""
```
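The PROMPT blocks in this document (and in the example Modelfiles below) use Go's text/template syntax: `{{ .Prompt }}` and `{{- if not .Context }}`. The server-side rendering code is not part of this diff, so the following is only an illustrative sketch, with a hypothetical `PromptVars` struct whose fields mirror the placeholders:

```go
package main

import (
	"os"
	"text/template"
)

// PromptVars is hypothetical; the templates above only check whether
// Context is empty and substitute Prompt.
type PromptVars struct {
	Context []int  // previous conversation context, empty on the first turn
	Prompt  string // the user's input
}

const prompt = `{{- if not .Context }}
### System:
You are a helpful assistant.
{{- end }}

### Instruction:
{{ .Prompt }}

### Response:
`

func main() {
	tmpl := template.Must(template.New("prompt").Parse(prompt))
	// On the first turn Context is empty, so the System block is included.
	_ = tmpl.Execute(os.Stdout, PromptVars{Prompt: "Write a haiku about Go."})
}
```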

15
examples/README.md Normal file
View File

@@ -0,0 +1,15 @@
# Examples
This directory contains examples that can be created and run with `ollama`.
To create a model:
```
ollama create example -f <example file>
```
To run a model:
```
ollama run example
```

7
examples/mario Normal file
View File

@@ -0,0 +1,7 @@
FROM llama2
PARAMETER temperature 1
PROMPT """
System: You are Mario from super mario bros, acting as an assistant.
User: {{ .Prompt }}
Assistant:
"""

View File

@@ -0,0 +1,14 @@
# Modelfile for creating a Midjourney prompts from a topic
# Run `ollama create mj -f pathtofile` and then `ollama run mj` and enter a topic
FROM library/nous-hermes:latest
PROMPT """
{{- if not .Context }}
### System:
Embrace your role as an AI-powered creative assistant, employing Midjourney to manifest compelling AI-generated art. I will outline a specific image concept, and in response, you must produce an exhaustive, multifaceted prompt for Midjourney, ensuring every detail of the original concept is represented in your instructions. Midjourney doesn't do well with text, so after the prompt, give me instructions that I can use to create the titles in a image editor.
{{- end }}
### Instruction:
{{ .Prompt }}
### Response:
"""

View File

@@ -1,15 +0,0 @@
# Python
This is a simple example of calling the Ollama api from a python app.
First, download a model:
```
curl -L https://huggingface.co/TheBloke/orca_mini_3B-GGML/resolve/main/orca-mini-3b.ggmlv3.q4_1.bin -o orca.bin
```
Then run it using the example script. You'll need to have Ollama running on your machine.
```
python3 main.py orca.bin
```

View File

@@ -1,32 +0,0 @@
import http.client
import json
import os
import sys
if len(sys.argv) < 2:
print("Usage: python main.py <model file>")
sys.exit(1)
conn = http.client.HTTPConnection('localhost', 11434)
headers = { 'Content-Type': 'application/json' }
# generate text from the model
conn.request("POST", "/api/generate", json.dumps({
'model': os.path.join(os.getcwd(), sys.argv[1]),
'prompt': 'write me a short story',
'stream': True
}), headers)
response = conn.getresponse()
def parse_generate(data):
for event in data.decode('utf-8').split("\n"):
if not event:
continue
yield event
if response.status == 200:
for chunk in response:
for event in parse_generate(chunk):
print(json.loads(event)['response'], end="", flush=True)

13
examples/recipemaker Normal file
View File

@@ -0,0 +1,13 @@
# Modelfile for creating a recipe from a list of ingredients
# Run `ollama create recipemaker -f pathtofile` and then `ollama run recipemaker` and feed it lists of ingredients to create recipes around.
FROM library/nous-hermes:latest
PROMPT """
{{- if not .Context }}
### System:
The instruction will be a list of ingredients. You should generate a recipe that can be made in less than an hour. You can also include ingredients that most people will find in their pantry every day. The recipe should be 4 people and you should include a description of what the meal will taste like
{{- end }}
### Instruction:
{{ .Prompt }}
### Response:
"""

14
examples/tweetwriter Normal file
View File

@@ -0,0 +1,14 @@
# Modelfile for creating a tweet from a topic
# Run `ollama create tweetwriter -f pathtofile` and then `ollama run tweetwriter` and enter a topic
FROM library/nous-hermes:latest
PROMPT """
{{- if not .Context }}
### System:
You are a content marketer who needs to come up with a short but succinct tweet. Make sure to include the appropriate hashtags and links. Sometimes when appropriate, describe a meme that can be includes as well. All answers should be in the form of a tweet which has a max size of 280 characters. Every instruction will be the topic to create a tweet about.
{{- end }}
### Instruction:
{{ .Prompt }}
### Response:
"""

141
format/time.go Normal file
View File

@@ -0,0 +1,141 @@
package format
import (
"fmt"
"math"
"strings"
"time"
)
// HumanDuration returns a human-readable approximation of a duration
// (eg. "About a minute", "4 hours ago", etc.).
// Modified version of github.com/docker/go-units.HumanDuration
func HumanDuration(d time.Duration) string {
return HumanDurationWithCase(d, true)
}
// HumanDurationWithCase returns a human-readable approximation of a
// duration (eg. "About a minute", "4 hours ago", etc.). but allows
// you to specify whether the first word should be capitalized
// (eg. "About" vs. "about")
func HumanDurationWithCase(d time.Duration, useCaps bool) string {
seconds := int(d.Seconds())
switch {
case seconds < 1:
if useCaps {
return "Less than a second"
}
return "less than a second"
case seconds == 1:
return "1 second"
case seconds < 60:
return fmt.Sprintf("%d seconds", seconds)
}
minutes := int(d.Minutes())
switch {
case minutes == 1:
if useCaps {
return "About a minute"
}
return "about a minute"
case minutes < 60:
return fmt.Sprintf("%d minutes", minutes)
}
hours := int(math.Round(d.Hours()))
switch {
case hours == 1:
if useCaps {
return "About an hour"
}
return "about an hour"
case hours < 48:
return fmt.Sprintf("%d hours", hours)
case hours < 24*7*2:
return fmt.Sprintf("%d days", hours/24)
case hours < 24*30*2:
return fmt.Sprintf("%d weeks", hours/24/7)
case hours < 24*365*2:
return fmt.Sprintf("%d months", hours/24/30)
}
return fmt.Sprintf("%d years", int(d.Hours())/24/365)
}
func HumanTime(t time.Time, zeroValue string) string {
return humanTimeWithCase(t, zeroValue, true)
}
func HumanTimeLower(t time.Time, zeroValue string) string {
return humanTimeWithCase(t, zeroValue, false)
}
func humanTimeWithCase(t time.Time, zeroValue string, useCaps bool) string {
if t.IsZero() {
return zeroValue
}
delta := time.Since(t)
if delta < 0 {
return HumanDurationWithCase(-delta, useCaps) + " from now"
}
return HumanDurationWithCase(delta, useCaps) + " ago"
}
// ExcatDuration returns a human readable hours/minutes/seconds or milliseconds format of a duration
// the most precise level of duration is milliseconds
func ExactDuration(d time.Duration) string {
if d.Seconds() < 1 {
if d.Milliseconds() == 1 {
return fmt.Sprintf("%d millisecond", d.Milliseconds())
}
return fmt.Sprintf("%d milliseconds", d.Milliseconds())
}
var readableDur strings.Builder
dur := d.String()
// split the default duration string format of 0h0m0s into something nicer to read
h := strings.Split(dur, "h")
if len(h) > 1 {
hours := h[0]
if hours == "1" {
readableDur.WriteString(fmt.Sprintf("%s hour ", hours))
} else {
readableDur.WriteString(fmt.Sprintf("%s hours ", hours))
}
dur = h[1]
}
m := strings.Split(dur, "m")
if len(m) > 1 {
mins := m[0]
switch mins {
case "0":
// skip
case "1":
readableDur.WriteString(fmt.Sprintf("%s minute ", mins))
default:
readableDur.WriteString(fmt.Sprintf("%s minutes ", mins))
}
dur = m[1]
}
s := strings.Split(dur, "s")
if len(s) > 0 {
sec := s[0]
switch sec {
case "0":
// skip
case "1":
readableDur.WriteString(fmt.Sprintf("%s second ", sec))
default:
readableDur.WriteString(fmt.Sprintf("%s seconds ", sec))
}
}
return strings.TrimSpace(readableDur.String())
}

102
format/time_test.go Normal file
View File

@@ -0,0 +1,102 @@
package format
import (
"testing"
"time"
)
func assertEqual(t *testing.T, a interface{}, b interface{}) {
if a != b {
t.Errorf("Assert failed, expected %v, got %v", b, a)
}
}
func TestHumanDuration(t *testing.T) {
day := 24 * time.Hour
week := 7 * day
month := 30 * day
year := 365 * day
assertEqual(t, "Less than a second", HumanDuration(450*time.Millisecond))
assertEqual(t, "Less than a second", HumanDurationWithCase(450*time.Millisecond, true))
assertEqual(t, "less than a second", HumanDurationWithCase(450*time.Millisecond, false))
assertEqual(t, "1 second", HumanDuration(1*time.Second))
assertEqual(t, "45 seconds", HumanDuration(45*time.Second))
assertEqual(t, "46 seconds", HumanDuration(46*time.Second))
assertEqual(t, "59 seconds", HumanDuration(59*time.Second))
assertEqual(t, "About a minute", HumanDuration(60*time.Second))
assertEqual(t, "About a minute", HumanDurationWithCase(1*time.Minute, true))
assertEqual(t, "about a minute", HumanDurationWithCase(1*time.Minute, false))
assertEqual(t, "3 minutes", HumanDuration(3*time.Minute))
assertEqual(t, "35 minutes", HumanDuration(35*time.Minute))
assertEqual(t, "35 minutes", HumanDuration(35*time.Minute+40*time.Second))
assertEqual(t, "45 minutes", HumanDuration(45*time.Minute))
assertEqual(t, "45 minutes", HumanDuration(45*time.Minute+40*time.Second))
assertEqual(t, "46 minutes", HumanDuration(46*time.Minute))
assertEqual(t, "59 minutes", HumanDuration(59*time.Minute))
assertEqual(t, "About an hour", HumanDuration(1*time.Hour))
assertEqual(t, "About an hour", HumanDurationWithCase(1*time.Hour+29*time.Minute, true))
assertEqual(t, "about an hour", HumanDurationWithCase(1*time.Hour+29*time.Minute, false))
assertEqual(t, "2 hours", HumanDuration(1*time.Hour+31*time.Minute))
assertEqual(t, "2 hours", HumanDuration(1*time.Hour+59*time.Minute))
assertEqual(t, "3 hours", HumanDuration(3*time.Hour))
assertEqual(t, "3 hours", HumanDuration(3*time.Hour+29*time.Minute))
assertEqual(t, "4 hours", HumanDuration(3*time.Hour+31*time.Minute))
assertEqual(t, "4 hours", HumanDuration(3*time.Hour+59*time.Minute))
assertEqual(t, "4 hours", HumanDuration(3*time.Hour+60*time.Minute))
assertEqual(t, "24 hours", HumanDuration(24*time.Hour))
assertEqual(t, "36 hours", HumanDuration(1*day+12*time.Hour))
assertEqual(t, "2 days", HumanDuration(2*day))
assertEqual(t, "7 days", HumanDuration(7*day))
assertEqual(t, "13 days", HumanDuration(13*day+5*time.Hour))
assertEqual(t, "2 weeks", HumanDuration(2*week))
assertEqual(t, "2 weeks", HumanDuration(2*week+4*day))
assertEqual(t, "3 weeks", HumanDuration(3*week))
assertEqual(t, "4 weeks", HumanDuration(4*week))
assertEqual(t, "4 weeks", HumanDuration(4*week+3*day))
assertEqual(t, "4 weeks", HumanDuration(1*month))
assertEqual(t, "6 weeks", HumanDuration(1*month+2*week))
assertEqual(t, "2 months", HumanDuration(2*month))
assertEqual(t, "2 months", HumanDuration(2*month+2*week))
assertEqual(t, "3 months", HumanDuration(3*month))
assertEqual(t, "3 months", HumanDuration(3*month+1*week))
assertEqual(t, "5 months", HumanDuration(5*month+2*week))
assertEqual(t, "13 months", HumanDuration(13*month))
assertEqual(t, "23 months", HumanDuration(23*month))
assertEqual(t, "24 months", HumanDuration(24*month))
assertEqual(t, "2 years", HumanDuration(24*month+2*week))
assertEqual(t, "3 years", HumanDuration(3*year+2*month))
}
func TestHumanTime(t *testing.T) {
now := time.Now()
t.Run("zero value", func(t *testing.T) {
assertEqual(t, HumanTime(time.Time{}, "never"), "never")
})
t.Run("time in the future", func(t *testing.T) {
v := now.Add(48 * time.Hour)
assertEqual(t, HumanTime(v, ""), "2 days from now")
})
t.Run("time in the past", func(t *testing.T) {
v := now.Add(-48 * time.Hour)
assertEqual(t, HumanTime(v, ""), "2 days ago")
})
}
func TestExactDuration(t *testing.T) {
assertEqual(t, "1 millisecond", ExactDuration(1*time.Millisecond))
assertEqual(t, "10 milliseconds", ExactDuration(10*time.Millisecond))
assertEqual(t, "1 second", ExactDuration(1*time.Second))
assertEqual(t, "10 seconds", ExactDuration(10*time.Second))
assertEqual(t, "1 minute", ExactDuration(1*time.Minute))
assertEqual(t, "10 minutes", ExactDuration(10*time.Minute))
assertEqual(t, "1 hour", ExactDuration(1*time.Hour))
assertEqual(t, "10 hours", ExactDuration(10*time.Hour))
assertEqual(t, "1 hour 1 second", ExactDuration(1*time.Hour+1*time.Second))
assertEqual(t, "1 hour 10 seconds", ExactDuration(1*time.Hour+10*time.Second))
assertEqual(t, "1 hour 1 minute", ExactDuration(1*time.Hour+1*time.Minute))
assertEqual(t, "1 hour 10 minutes", ExactDuration(1*time.Hour+10*time.Minute))
assertEqual(t, "1 hour 1 minute 1 second", ExactDuration(1*time.Hour+1*time.Minute+1*time.Second))
assertEqual(t, "10 hours 10 minutes 10 seconds", ExactDuration(10*time.Hour+10*time.Minute+10*time.Second))
}

2
go.mod
View File

@@ -3,7 +3,9 @@ module github.com/jmorganca/ollama
go 1.20
require (
github.com/dustin/go-humanize v1.0.1
github.com/gin-gonic/gin v1.9.1
github.com/olekukonko/tablewriter v0.0.5
github.com/spf13/cobra v1.7.0
)

5
go.sum
View File

@@ -10,6 +10,8 @@ github.com/cpuguy83/go-md2man/v2 v2.0.2/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46t
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/gabriel-vasile/mimetype v1.4.2 h1:w5qFW6JKBz9Y393Y4q372O9A7cUSequkh1Q7OhCmWKU=
github.com/gabriel-vasile/mimetype v1.4.2/go.mod h1:zApsH/mKG4w07erKIaJPFiX0Tsq9BFQgN3qGY5GnNgA=
github.com/gin-contrib/sse v0.1.0 h1:Y/yl/+YNO8GZSjAhjMsSuLt29uWRFHdHYUb5lYOV9qE=
@@ -43,6 +45,7 @@ github.com/leodido/go-urn v1.2.4/go.mod h1:7ZrI8mTSeBSHl/UaRyKQW1qZeMgak41ANeCNa
github.com/mattn/go-isatty v0.0.17/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
github.com/mattn/go-isatty v0.0.19 h1:JITubQf0MOLdlGRuRq+jtsDlekdYPia9ZFsB8h/APPA=
github.com/mattn/go-isatty v0.0.19/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-runewidth v0.0.9/go.mod h1:H031xJmbD/WCDINGzjvQ9THkh0rPKHF+m2gUSrubnMI=
github.com/mattn/go-runewidth v0.0.14 h1:+xnbZSEeDbOIg5/mE6JF0w6n9duR1l3/WmbinWVwUuU=
github.com/mattn/go-runewidth v0.0.14/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db h1:62I3jR2EmQ4l5rM/4FEfDWcRD+abF5XlKShorW5LRoQ=
@@ -52,6 +55,8 @@ github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/olekukonko/tablewriter v0.0.5 h1:P2Ga83D34wi1o9J6Wh1mRuqd4mF/x/lgBS7N7AbDhec=
github.com/olekukonko/tablewriter v0.0.5/go.mod h1:hPp6KlRPjbx+hW8ykQs1w3UBbZlj6HuIJcUGPhkA7kY=
github.com/pelletier/go-toml/v2 v2.0.8 h1:0ctb6s9mE31h0/lhu+J6OPmVeDxJn+kYnJc2jZR9tGQ=
github.com/pelletier/go-toml/v2 v2.0.8/go.mod h1:vuYfssBdrU2XDZ9bYydBu6t+6a6PYNcZljzZR9VXg+4=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=

View File

@@ -38,7 +38,7 @@ func Parse(reader io.Reader) ([]Command, error) {
}
command := Command{}
switch fields[0] {
switch strings.ToUpper(fields[0]) {
case "FROM":
command.Name = "model"
command.Arg = fields[1]
@@ -46,8 +46,8 @@ func Parse(reader io.Reader) ([]Command, error) {
return nil, fmt.Errorf("no model specified in FROM line")
}
foundModel = true
case "PROMPT":
command.Name = "prompt"
case "PROMPT", "LICENSE":
command.Name = strings.ToLower(fields[0])
if fields[1] == `"""` {
multiline = true
multilineCommand = &command
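With `strings.ToUpper(fields[0])` in the switch, instruction keywords are now matched case-insensitively, and `LICENSE` is handled the same way as `PROMPT`. A hedged sketch of driving the parser directly; only `Parse`, `Command.Name`, and `Command.Arg` are taken from this diff, and the multiline handling is not fully shown here:

```go
package main

import (
	"fmt"
	"log"
	"strings"

	"github.com/jmorganca/ollama/parser"
)

func main() {
	// Lower-case instructions should now parse too, since the parser
	// upper-cases the keyword before matching it.
	modelfile := `from llama2
prompt """
You are a helpful assistant.
User: {{ .Prompt }}
Assistant:
"""
`
	commands, err := parser.Parse(strings.NewReader(modelfile))
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range commands {
		fmt.Printf("%s => %q\n", c.Name, c.Arg)
	}
}
```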

View File

@@ -3,7 +3,6 @@ package server
import (
"bytes"
"crypto/sha256"
"encoding/hex"
"encoding/json"
"errors"
"fmt"
@@ -22,8 +21,6 @@ import (
"github.com/jmorganca/ollama/parser"
)
var DefaultRegistry string = "https://registry.ollama.ai"
type Model struct {
Name string `json:"name"`
ModelPath string
@@ -44,10 +41,9 @@ type Layer struct {
Size int `json:"size"`
}
type LayerWithBuffer struct {
type LayerReader struct {
Layer
Buffer *bytes.Buffer
io.Reader
}
type ConfigV2 struct {
@@ -61,27 +57,22 @@ type RootFS struct {
DiffIDs []string `json:"diff_ids"`
}
func modelsDir(part ...string) (string, error) {
home, err := os.UserHomeDir()
if err != nil {
return "", err
func (m *ManifestV2) GetTotalSize() int {
var total int
for _, layer := range m.Layers {
total += layer.Size
}
path := filepath.Join(home, ".ollama", "models", filepath.Join(part...))
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
return "", err
}
return path, nil
total += m.Config.Size
return total
}
func GetManifest(name string) (*ManifestV2, error) {
fp, err := modelsDir("manifests", name)
func GetManifest(mp ModelPath) (*ManifestV2, error) {
fp, err := mp.GetManifestPath(false)
if err != nil {
return nil, err
}
if _, err = os.Stat(fp); err != nil && !errors.Is(err, os.ErrNotExist) {
return nil, fmt.Errorf("couldn't find model '%s'", name)
return nil, fmt.Errorf("couldn't find model '%s'", mp.GetShortTagname())
}
var manifest *ManifestV2
@@ -101,17 +92,19 @@ func GetManifest(name string) (*ManifestV2, error) {
}
func GetModel(name string) (*Model, error) {
manifest, err := GetManifest(name)
mp := ParseModelPath(name)
manifest, err := GetManifest(mp)
if err != nil {
return nil, err
}
model := &Model{
Name: name,
Name: mp.GetFullTagname(),
}
for _, layer := range manifest.Layers {
filename, err := modelsDir("blobs", layer.Digest)
filename, err := GetBlobsPath(layer.Digest)
if err != nil {
return nil, err
}
@@ -166,7 +159,7 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
return err
}
var layers []*LayerWithBuffer
var layers []*LayerReader
params := make(map[string]string)
for _, c := range commands {
@@ -174,7 +167,7 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
switch c.Name {
case "model":
fn("looking for model")
mf, err := GetManifest(c.Arg)
mf, err := GetManifest(ParseModelPath(c.Arg))
if err != nil {
// if we couldn't read the manifest, try getting the bin file
fp, err := getAbsPath(c.Arg)
@@ -222,6 +215,16 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
}
l.MediaType = "application/vnd.ollama.image.prompt"
layers = append(layers, l)
case "license":
fn("creating license layer")
license := strings.NewReader(c.Arg)
l, err := CreateLayer(license)
if err != nil {
fn(fmt.Sprintf("couldn't create license layer: %v", err))
return fmt.Errorf("failed to create layer: %v", err)
}
l.MediaType = "application/vnd.ollama.image.license"
layers = append(layers, l)
default:
params[c.Name] = c.Arg
}
@@ -279,7 +282,7 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
return nil
}
func removeLayerFromLayers(layers []*LayerWithBuffer, mediaType string) []*LayerWithBuffer {
func removeLayerFromLayers(layers []*LayerReader, mediaType string) []*LayerReader {
j := 0
for _, l := range layers {
if l.MediaType != mediaType {
@@ -290,10 +293,10 @@ func removeLayerFromLayers(layers []*LayerWithBuffer, mediaType string) []*Layer
return layers[:j]
}
func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) error {
func SaveLayers(layers []*LayerReader, fn func(status string), force bool) error {
// Write each of the layers to disk
for _, layer := range layers {
fp, err := modelsDir("blobs", layer.Digest)
fp, err := GetBlobsPath(layer.Digest)
if err != nil {
return err
}
@@ -308,10 +311,10 @@ func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) e
}
defer out.Close()
_, err = io.Copy(out, layer.Buffer)
if err != nil {
if _, err = io.Copy(out, layer.Reader); err != nil {
return err
}
} else {
fn(fmt.Sprintf("using already created layer %s", layer.Digest))
}
@@ -320,7 +323,9 @@ func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) e
return nil
}
func CreateManifest(name string, cfg *LayerWithBuffer, layers []*Layer) error {
func CreateManifest(name string, cfg *LayerReader, layers []*Layer) error {
mp := ParseModelPath(name)
manifest := ManifestV2{
SchemaVersion: 2,
MediaType: "application/vnd.docker.distribution.manifest.v2+json",
@@ -337,15 +342,15 @@ func CreateManifest(name string, cfg *LayerWithBuffer, layers []*Layer) error {
return err
}
fp, err := modelsDir("manifests", name)
fp, err := mp.GetManifestPath(true)
if err != nil {
return err
}
return os.WriteFile(fp, manifestJSON, 0o644)
}
func GetLayerWithBufferFromLayer(layer *Layer) (*LayerWithBuffer, error) {
fp, err := modelsDir("blobs", layer.Digest)
func GetLayerWithBufferFromLayer(layer *Layer) (*LayerReader, error) {
fp, err := GetBlobsPath(layer.Digest)
if err != nil {
return nil, err
}
@@ -364,7 +369,7 @@ func GetLayerWithBufferFromLayer(layer *Layer) (*LayerWithBuffer, error) {
return newLayer, nil
}
func paramsToReader(params map[string]string) (io.Reader, error) {
func paramsToReader(params map[string]string) (io.ReadSeeker, error) {
opts := api.DefaultOptions()
typeOpts := reflect.TypeOf(opts)
@@ -422,7 +427,7 @@ func paramsToReader(params map[string]string) (io.Reader, error) {
return bytes.NewReader(bts), nil
}
func getLayerDigests(layers []*LayerWithBuffer) ([]string, error) {
func getLayerDigests(layers []*LayerReader) ([]string, error) {
var digests []string
for _, l := range layers {
if l.Digest == "" {
@@ -434,50 +439,33 @@ func getLayerDigests(layers []*LayerWithBuffer) ([]string, error) {
}
// CreateLayer creates a Layer object from a given file
func CreateLayer(f io.Reader) (*LayerWithBuffer, error) {
buf := new(bytes.Buffer)
_, err := io.Copy(buf, f)
if err != nil {
return nil, err
}
func CreateLayer(f io.ReadSeeker) (*LayerReader, error) {
digest, size := GetSHA256Digest(f)
f.Seek(0, 0)
digest, size := GetSHA256Digest(buf)
layer := &LayerWithBuffer{
layer := &LayerReader{
Layer: Layer{
MediaType: "application/vnd.docker.image.rootfs.diff.tar",
Digest: digest,
Size: size,
},
Buffer: buf,
Reader: f,
}
return layer, nil
}
func PushModel(name, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
fn("retrieving manifest", "", 0, 0, 0)
manifest, err := GetManifest(name)
func PushModel(name, username, password string, fn func(api.ProgressResponse)) error {
mp := ParseModelPath(name)
fn(api.ProgressResponse{Status: "retrieving manifest"})
manifest, err := GetManifest(mp)
if err != nil {
fn("couldn't retrieve manifest", "", 0, 0, 0)
fn(api.ProgressResponse{Status: "couldn't retrieve manifest"})
return err
}
var repoName string
var tag string
comps := strings.Split(name, ":")
switch {
case len(comps) < 1 || len(comps) > 2:
return fmt.Errorf("repository name was invalid")
case len(comps) == 1:
repoName = comps[0]
tag = "latest"
case len(comps) == 2:
repoName = comps[0]
tag = comps[1]
}
var layers []*Layer
var total int
var completed int
@@ -489,20 +477,30 @@ func PushModel(name, username, password string, fn func(status, digest string, T
total += manifest.Config.Size
for _, layer := range layers {
exists, err := checkBlobExistence(DefaultRegistry, repoName, layer.Digest, username, password)
exists, err := checkBlobExistence(mp, layer.Digest, username, password)
if err != nil {
return err
}
if exists {
completed += layer.Size
fn("using existing layer", layer.Digest, total, completed, float64(completed)/float64(total))
fn(api.ProgressResponse{
Status: "using existing layer",
Digest: layer.Digest,
Total: total,
Completed: completed,
})
continue
}
fn("starting upload", layer.Digest, total, completed, float64(completed)/float64(total))
fn(api.ProgressResponse{
Status: "starting upload",
Digest: layer.Digest,
Total: total,
Completed: completed,
})
location, err := startUpload(DefaultRegistry, repoName, username, password)
location, err := startUpload(mp, username, password)
if err != nil {
log.Printf("couldn't start upload: %v", err)
return err
@@ -514,11 +512,20 @@ func PushModel(name, username, password string, fn func(status, digest string, T
return err
}
completed += layer.Size
fn("upload complete", layer.Digest, total, completed, float64(completed)/float64(total))
fn(api.ProgressResponse{
Status: "upload complete",
Digest: layer.Digest,
Total: total,
Completed: completed,
})
}
fn("pushing manifest", "", total, completed, float64(completed/total))
url := fmt.Sprintf("%s/v2/%s/manifests/%s", DefaultRegistry, repoName, tag)
fn(api.ProgressResponse{
Status: "pushing manifest",
Total: total,
Completed: completed,
})
url := fmt.Sprintf("%s://%s/v2/%s/manifests/%s", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository(), mp.Tag)
headers := map[string]string{
"Content-Type": "application/vnd.docker.distribution.manifest.v2+json",
}
@@ -540,36 +547,25 @@ func PushModel(name, username, password string, fn func(status, digest string, T
return fmt.Errorf("registry responded with code %d: %v", resp.StatusCode, string(body))
}
fn("success", "", total, completed, 1.0)
fn(api.ProgressResponse{
Status: "success",
Total: total,
Completed: completed,
})
return nil
}
func PullModel(name, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
var repoName string
var tag string
func PullModel(name, username, password string, fn func(api.ProgressResponse)) error {
mp := ParseModelPath(name)
comps := strings.Split(name, ":")
switch {
case len(comps) < 1 || len(comps) > 2:
return fmt.Errorf("repository name was invalid")
case len(comps) == 1:
repoName = comps[0]
tag = "latest"
case len(comps) == 2:
repoName = comps[0]
tag = comps[1]
}
fn(api.ProgressResponse{Status: "pulling manifest"})
fn("pulling manifest", "", 0, 0, 0)
manifest, err := pullModelManifest(DefaultRegistry, repoName, tag, username, password)
manifest, err := pullModelManifest(mp, username, password)
if err != nil {
return fmt.Errorf("pull model manifest: %q", err)
}
log.Printf("manifest = %#v", manifest)
var layers []*Layer
var total int
var completed int
@@ -581,45 +577,39 @@ func PullModel(name, username, password string, fn func(status, digest string, T
total += manifest.Config.Size
for _, layer := range layers {
fn("starting download", layer.Digest, total, completed, float64(completed)/float64(total))
if err := downloadBlob(DefaultRegistry, repoName, layer.Digest, username, password, fn); err != nil {
fn(fmt.Sprintf("error downloading: %v", err), layer.Digest, 0, 0, 0)
if err := downloadBlob(mp, layer.Digest, username, password, fn); err != nil {
fn(api.ProgressResponse{Status: fmt.Sprintf("error downloading: %v", err), Digest: layer.Digest})
return err
}
completed += layer.Size
fn("download complete", layer.Digest, total, completed, float64(completed)/float64(total))
}
fn("writing manifest", "", total, completed, 1.0)
fn(api.ProgressResponse{Status: "writing manifest"})
manifestJSON, err := json.Marshal(manifest)
if err != nil {
return err
}
fp, err := modelsDir("manifests", name)
fp, err := mp.GetManifestPath(true)
if err != nil {
return err
}
err = os.MkdirAll(path.Dir(fp), 0o700)
if err != nil {
return fmt.Errorf("make manifests directory: %w", err)
}
err = os.WriteFile(fp, manifestJSON, 0644)
if err != nil {
log.Printf("couldn't write to %s", fp)
return err
}
fn("success", "", total, completed, 1.0)
fn(api.ProgressResponse{Status: "success"})
return nil
}
func pullModelManifest(registryURL, repoName, tag, username, password string) (*ManifestV2, error) {
url := fmt.Sprintf("%s/v2/%s/manifests/%s", registryURL, repoName, tag)
func pullModelManifest(mp ModelPath, username, password string) (*ManifestV2, error) {
url := fmt.Sprintf("%s://%s/v2/%s/manifests/%s", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository(), mp.Tag)
headers := map[string]string{
"Accept": "application/vnd.docker.distribution.manifest.v2+json",
}
@@ -645,7 +635,7 @@ func pullModelManifest(registryURL, repoName, tag, username, password string) (*
return m, err
}
func createConfigLayer(layers []string) (*LayerWithBuffer, error) {
func createConfigLayer(layers []string) (*LayerReader, error) {
// TODO change architecture and OS
config := ConfigV2{
Architecture: "arm64",
@@ -661,29 +651,32 @@ func createConfigLayer(layers []string) (*LayerWithBuffer, error) {
return nil, err
}
buf := bytes.NewBuffer(configJSON)
digest, size := GetSHA256Digest(buf)
digest, size := GetSHA256Digest(bytes.NewBuffer(configJSON))
layer := &LayerWithBuffer{
layer := &LayerReader{
Layer: Layer{
MediaType: "application/vnd.docker.container.image.v1+json",
Digest: digest,
Size: size,
},
Buffer: buf,
Reader: bytes.NewBuffer(configJSON),
}
return layer, nil
}
// GetSHA256Digest returns the SHA256 hash of a given buffer and returns it, and the size of buffer
func GetSHA256Digest(data *bytes.Buffer) (string, int) {
layerBytes := data.Bytes()
hash := sha256.Sum256(layerBytes)
return "sha256:" + hex.EncodeToString(hash[:]), len(layerBytes)
func GetSHA256Digest(r io.Reader) (string, int) {
h := sha256.New()
n, err := io.Copy(h, r)
if err != nil {
log.Fatal(err)
}
return fmt.Sprintf("sha256:%x", h.Sum(nil)), int(n)
}
func startUpload(registryURL string, repositoryName string, username string, password string) (string, error) {
url := fmt.Sprintf("%s/v2/%s/blobs/uploads/", registryURL, repositoryName)
func startUpload(mp ModelPath, username string, password string) (string, error) {
url := fmt.Sprintf("%s://%s/v2/%s/blobs/uploads/", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository())
resp, err := makeRequest("POST", url, nil, nil, username, password)
if err != nil {
@@ -708,8 +701,8 @@ func startUpload(registryURL string, repositoryName string, username string, pas
}
// Function to check if a blob already exists in the Docker registry
func checkBlobExistence(registryURL string, repositoryName string, digest string, username string, password string) (bool, error) {
url := fmt.Sprintf("%s/v2/%s/blobs/%s", registryURL, repositoryName, digest)
func checkBlobExistence(mp ModelPath, digest string, username string, password string) (bool, error) {
url := fmt.Sprintf("%s://%s/v2/%s/blobs/%s", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository(), digest)
resp, err := makeRequest("HEAD", url, nil, nil, username, password)
if err != nil {
@@ -735,7 +728,7 @@ func uploadBlob(location string, layer *Layer, username string, password string)
// TODO allow canceling uploads via DELETE
// TODO allow cross repo blob mount
fp, err := modelsDir("blobs", layer.Digest)
fp, err := GetBlobsPath(layer.Digest)
if err != nil {
return err
}
@@ -761,16 +754,20 @@ func uploadBlob(location string, layer *Layer, username string, password string)
return nil
}
func downloadBlob(registryURL, repoName, digest string, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
fp, err := modelsDir("blobs", digest)
func downloadBlob(mp ModelPath, digest string, username, password string, fn func(api.ProgressResponse)) error {
fp, err := GetBlobsPath(digest)
if err != nil {
return err
}
_, err = os.Stat(fp)
if !os.IsNotExist(err) {
if fi, _ := os.Stat(fp); fi != nil {
// we already have the file, so return
log.Printf("already have %s\n", digest)
fn(api.ProgressResponse{
Digest: digest,
Total: int(fi.Size()),
Completed: int(fi.Size()),
})
return nil
}
@@ -786,7 +783,7 @@ func downloadBlob(registryURL, repoName, digest string, username, password strin
size = fi.Size()
}
url := fmt.Sprintf("%s/v2/%s/blobs/%s", registryURL, repoName, digest)
url := fmt.Sprintf("%s://%s/v2/%s/blobs/%s", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository(), digest)
headers := map[string]string{
"Range": fmt.Sprintf("bytes=%d-", size),
}
@@ -819,10 +816,21 @@ func downloadBlob(registryURL, repoName, digest string, username, password strin
total := remaining + completed
for {
fn(fmt.Sprintf("Downloading %s", digest), digest, int(total), int(completed), float64(completed)/float64(total))
fn(api.ProgressResponse{
Status: fmt.Sprintf("downloading %s", digest),
Digest: digest,
Total: int(total),
Completed: int(completed),
})
if completed >= total {
if err := os.Rename(fp+"-partial", fp); err != nil {
fn(fmt.Sprintf("error renaming file: %v", err), digest, int(total), int(completed), 1)
fn(api.ProgressResponse{
Status: fmt.Sprintf("error renaming file: %v", err),
Digest: digest,
Total: int(total),
Completed: int(completed),
})
return err
}
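The switch from `LayerWithBuffer` to `LayerReader` is what the "dont consume reader when calculating digest" commit refers to: instead of copying each layer into a `bytes.Buffer`, `GetSHA256Digest` now streams the reader through SHA-256, and `CreateLayer` seeks back to the start so the same reader can still be written to disk. A standalone sketch of that pattern (not the server code itself):

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"log"
	"os"
	"strings"
)

// digestAndSize hashes the reader's contents without buffering them,
// returning the digest in the same "sha256:<hex>" form used above.
func digestAndSize(r io.Reader) (string, int) {
	h := sha256.New()
	n, err := io.Copy(h, r)
	if err != nil {
		log.Fatal(err)
	}
	return fmt.Sprintf("sha256:%x", h.Sum(nil)), int(n)
}

func main() {
	content := strings.NewReader("FROM llama2\n")

	digest, size := digestAndSize(content)
	fmt.Println(digest, size)

	// Hashing consumed the reader, so rewind before copying the same
	// bytes anywhere else (e.g. into a blob file on disk).
	if _, err := content.Seek(0, io.SeekStart); err != nil {
		log.Fatal(err)
	}
	if _, err := io.Copy(os.Stdout, content); err != nil {
		log.Fatal(err)
	}
}
```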

115
server/modelpath.go Normal file
View File

@@ -0,0 +1,115 @@
package server
import (
"fmt"
"os"
"path/filepath"
"strings"
)
type ModelPath struct {
ProtocolScheme string
Registry string
Namespace string
Repository string
Tag string
}
const (
DefaultRegistry = "registry.ollama.ai"
DefaultNamespace = "library"
DefaultTag = "latest"
DefaultProtocolScheme = "https"
)
func ParseModelPath(name string) ModelPath {
slashParts := strings.Split(name, "/")
var registry, namespace, repository, tag string
switch len(slashParts) {
case 3:
registry = slashParts[0]
namespace = slashParts[1]
repository = strings.Split(slashParts[2], ":")[0]
case 2:
registry = DefaultRegistry
namespace = slashParts[0]
repository = strings.Split(slashParts[1], ":")[0]
case 1:
registry = DefaultRegistry
namespace = DefaultNamespace
repository = strings.Split(slashParts[0], ":")[0]
default:
fmt.Println("Invalid image format.")
return ModelPath{}
}
colonParts := strings.Split(name, ":")
if len(colonParts) == 2 {
tag = colonParts[1]
} else {
tag = DefaultTag
}
return ModelPath{
ProtocolScheme: DefaultProtocolScheme,
Registry: registry,
Namespace: namespace,
Repository: repository,
Tag: tag,
}
}
func (mp ModelPath) GetNamespaceRepository() string {
return fmt.Sprintf("%s/%s", mp.Namespace, mp.Repository)
}
func (mp ModelPath) GetFullTagname() string {
return fmt.Sprintf("%s/%s/%s:%s", mp.Registry, mp.Namespace, mp.Repository, mp.Tag)
}
func (mp ModelPath) GetShortTagname() string {
if mp.Registry == DefaultRegistry && mp.Namespace == DefaultNamespace {
return fmt.Sprintf("%s:%s", mp.Repository, mp.Tag)
}
return fmt.Sprintf("%s/%s:%s", mp.Namespace, mp.Repository, mp.Tag)
}
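// Illustrative sketch, not part of the diff: how ParseModelPath and the tag helpers above
// resolve a partially qualified name (model name is hypothetical):
mp := ParseModelPath("library/llama2:13b")
// mp.Registry == "registry.ollama.ai" (default), mp.Namespace == "library",
// mp.Repository == "llama2", mp.Tag == "13b", mp.ProtocolScheme == "https"
fmt.Println(mp.GetFullTagname())  // registry.ollama.ai/library/llama2:13b
fmt.Println(mp.GetShortTagname()) // llama2:13b (default registry and namespace are elided)
// A bare name picks up every default: ParseModelPath("llama2") resolves to
// registry.ollama.ai/library/llama2:latest.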
func (mp ModelPath) GetManifestPath(createDir bool) (string, error) {
home, err := os.UserHomeDir()
if err != nil {
return "", err
}
path := filepath.Join(home, ".ollama", "models", "manifests", mp.Registry, mp.Namespace, mp.Repository, mp.Tag)
if createDir {
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
return "", err
}
}
return path, nil
}
func GetManifestPath() (string, error) {
home, err := os.UserHomeDir()
if err != nil {
return "", err
}
return filepath.Join(home, ".ollama", "models", "manifests"), nil
}
func GetBlobsPath(digest string) (string, error) {
home, err := os.UserHomeDir()
if err != nil {
return "", err
}
path := filepath.Join(home, ".ollama", "models", "blobs", digest)
if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
return "", err
}
return path, nil
}
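// Illustrative sketch, not part of the diff: where the helpers above place files on disk,
// reusing mp from the earlier sketch, with a hypothetical $HOME of /home/alice and a
// hypothetical digest:
manifestPath, _ := mp.GetManifestPath(false)
blobPath, _ := GetBlobsPath("sha256:0123abcd")
fmt.Println(manifestPath) // /home/alice/.ollama/models/manifests/registry.ollama.ai/library/llama2/13b
fmt.Println(blobPath)     // /home/alice/.ollama/models/blobs/sha256:0123abcd (parent dir created via MkdirAll)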


@@ -101,15 +101,10 @@ func pull(c *gin.Context) {
ch := make(chan any)
go func() {
defer close(ch)
fn := func(status, digest string, total, completed int, percent float64) {
ch <- api.PullProgress{
Status: status,
Digest: digest,
Total: total,
Completed: completed,
Percent: percent,
}
fn := func(r api.ProgressResponse) {
ch <- r
}
if err := PullModel(req.Name, req.Username, req.Password, fn); err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
return
@@ -129,15 +124,10 @@ func push(c *gin.Context) {
ch := make(chan any)
go func() {
defer close(ch)
fn := func(status, digest string, total, completed int, percent float64) {
ch <- api.PushProgress{
Status: status,
Digest: digest,
Total: total,
Completed: completed,
Percent: percent,
}
fn := func(r api.ProgressResponse) {
ch <- r
}
if err := PushModel(req.Name, req.Username, req.Password, fn); err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
return
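// Illustrative sketch, not part of the diff: a client consuming the progress stream the
// pull and push handlers above produce. It assumes the pull handler is mounted at
// POST /api/pull, that the request body uses a "name" field, and that streamResponse
// writes one JSON object per line; the address and model name are hypothetical.
// (Uses bufio, encoding/json, net/http, strings, log, and the repo's api package.)
func watchPull() error {
	body := strings.NewReader(`{"name": "llama2"}`)
	resp, err := http.Post("http://127.0.0.1:11434/api/pull", "application/json", body)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		var p api.ProgressResponse
		if err := json.Unmarshal(scanner.Bytes(), &p); err != nil {
			return err
		}
		log.Printf("%s %d/%d", p.Status, p.Completed, p.Total)
	}
	return scanner.Err()
}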
@@ -181,6 +171,52 @@ func create(c *gin.Context) {
streamResponse(c, ch)
}
func list(c *gin.Context) {
var models []api.ListResponseModel
fp, err := GetManifestPath()
if err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
return
}
err = filepath.Walk(fp, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if !info.IsDir() {
fi, err := os.Stat(path)
if err != nil {
log.Printf("skipping file: %s", fp)
return nil
}
path := path[len(fp)+1:]
slashIndex := strings.LastIndex(path, "/")
if slashIndex == -1 {
return nil
}
tag := path[:slashIndex] + ":" + path[slashIndex+1:]
mp := ParseModelPath(tag)
manifest, err := GetManifest(mp)
if err != nil {
log.Printf("skipping file: %s", fp)
return nil
}
model := api.ListResponseModel{
Name: mp.GetShortTagname(),
Size: manifest.GetTotalSize(),
ModifiedAt: fi.ModTime(),
}
models = append(models, model)
}
return nil
})
if err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
return
}
c.JSON(http.StatusOK, api.ListResponse{models})
}
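// Illustrative sketch, not part of the diff: how the walk above turns a manifest file
// back into the model name returned by GET /api/tags (path is hypothetical):
rel := "registry.ollama.ai/library/llama2/13b" // path relative to the manifests root
i := strings.LastIndex(rel, "/")
tag := rel[:i] + ":" + rel[i+1:]                   // registry.ollama.ai/library/llama2:13b
fmt.Println(ParseModelPath(tag).GetShortTagname()) // llama2:13b, the name shown in the listing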
func Serve(ln net.Listener) error {
r := gin.Default()
@@ -192,6 +228,7 @@ func Serve(ln net.Listener) error {
r.POST("/api/generate", generate)
r.POST("/api/create", create)
r.POST("/api/push", push)
r.GET("/api/tags", list)
log.Printf("Listening on %s", ln.Addr())
s := &http.Server{


@@ -1,3 +1,6 @@
import Image from 'next/image'
import Header from '../header'
import Downloader from './downloader'
import Signup from './signup'
@@ -26,22 +29,19 @@ export default async function Download() {
}
return (
<main className='flex min-h-screen max-w-2xl flex-col p-4 lg:p-24 items-center mx-auto'>
<img src='/ollama.png' className='w-16 h-auto' />
<section className='my-12 text-center'>
<h2 className='my-2 max-w-md text-3xl tracking-tight'>Downloading Ollama</h2>
<h3 className='text-sm text-neutral-500'>
Problems downloading?{' '}
<a href={asset.browser_download_url} className='underline'>
Try again
</a>
</h3>
<Downloader url={asset.browser_download_url} />
</section>
<section className='max-w-sm flex flex-col w-full items-center border border-neutral-200 rounded-xl px-8 pt-8 pb-2'>
<p className='text-lg leading-tight text-center mb-6 max-w-[260px]'>Sign up for updates</p>
<>
<Header />
<main className='flex min-h-screen max-w-6xl flex-col py-20 px-16 lg:p-32 items-center mx-auto'>
<Image src='/ollama.png' width={64} height={64} alt='ollamaIcon' />
<section className='mt-12 mb-8 text-center'>
<h2 className='my-2 max-w-md text-3xl tracking-tight'>Downloading...</h2>
<h3 className='text-base text-neutral-500 mt-12 max-w-[16rem]'>
While Ollama downloads, sign up to get notified of new updates.
</h3>
<Downloader url={asset.browser_download_url} />
</section>
<Signup />
</section>
</main>
</main>
</>
)
}


@@ -28,7 +28,7 @@ export default function Signup() {
return false
}}
className='flex self-stretch flex-col gap-3 h-32'
className='flex self-stretch flex-col gap-3 h-32 md:mx-40 lg:mx-72'
>
<input
required
@@ -37,13 +37,13 @@ export default function Signup() {
onChange={e => setEmail(e.target.value)}
type='email'
placeholder='your@email.com'
className='bg-neutral-100 rounded-lg px-4 py-2 focus:outline-none placeholder-neutral-500'
className='border border-neutral-200 rounded-lg px-4 py-2 focus:outline-none placeholder-neutral-300'
/>
<input
type='submit'
value='Get updates'
disabled={submitting}
className='bg-black text-white disabled:text-neutral-200 disabled:bg-neutral-700 rounded-lg px-4 py-2 focus:outline-none cursor-pointer'
className='bg-black text-white disabled:text-neutral-200 disabled:bg-neutral-700 rounded-full px-4 py-2 focus:outline-none cursor-pointer'
/>
{success && <p className='text-center text-sm'>You&apos;re signed up for updates</p>}
</form>

web/app/header.tsx (new file, +25 lines)

@@ -0,0 +1,25 @@
import Link from "next/link"
const navigation = [
{ name: 'Github', href: 'https://github.com/jmorganca/ollama' },
{ name: 'Download', href: '/download' },
]
export default function Header() {
return (
<header className="absolute inset-x-0 top-0 z-50">
<nav className="mx-auto flex items-center justify-between px-10 py-4">
<Link className="flex-1 font-bold" href="/">
Ollama
</Link>
<div className="flex space-x-8">
{navigation.map((item) => (
<Link key={item.name} href={item.href} className="text-sm leading-6 text-gray-900">
{item.name}
</Link>
))}
</div>
</nav>
</header >
)
}


@@ -1,34 +1,32 @@
import { AiFillApple } from 'react-icons/ai'
import Image from 'next/image'
import Link from 'next/link'
import models from '../../models.json'
import Header from './header'
export default async function Home() {
return (
<main className='flex min-h-screen max-w-2xl flex-col p-4 lg:p-24'>
<img src='/ollama.png' className='w-16 h-auto' />
<section className='my-4'>
<p className='my-3 max-w-md'>
<a className='underline' href='https://github.com/jmorganca/ollama'>
Ollama
</a>{' '}
is a tool for running large language models, currently for macOS with Windows and Linux coming soon.
<br />
<br />
<a href='/download'>
<button className='bg-black text-white text-sm py-2 px-3 rounded-lg flex items-center gap-2'>
<AiFillApple className='h-auto w-5 relative -top-px' /> Download for macOS
</button>
</a>
</p>
</section>
<section className='my-4'>
<h2 className='mb-4 text-lg'>Example models you can try running:</h2>
{models.map(m => (
<div className='my-2 grid font-mono' key={m.name}>
<code className='py-0.5'>ollama run {m.name}</code>
<>
<Header />
<main className='flex min-h-screen max-w-6xl flex-col py-20 px-16 md:p-32 items-center mx-auto'>
<Image src='/ollama.png' width={64} height={64} alt='ollamaIcon' />
<section className='my-12 text-center'>
<div className='flex flex-col space-y-2'>
<h2 className='md:max-w-[18rem] mx-auto my-2 text-3xl tracking-tight'>Portable large language models</h2>
<h3 className='md:max-w-xs mx-auto text-base text-neutral-500'>
Bundle a model's weights, configuration, prompts, data and more into self-contained packages that run anywhere.
</h3>
</div>
))}
</section>
</main>
<div className='mx-auto flex flex-col space-y-4 mt-12'>
<Link href='/download' className='md:mx-10 lg:mx-14 bg-black text-white rounded-full px-4 py-2 focus:outline-none cursor-pointer'>
Download
</Link>
<p className='text-neutral-500 text-sm '>
Available for macOS with Apple Silicon <br />
Windows & Linux support coming soon.
</p>
</div>
</section>
</main>
</>
)
}