
llama.cpp bindings for JavaScript

Works with Deno · JSR Score 58% · Published 4 weeks ago (0.0.3)

libllama

Run llama.cpp from Deno

import { Llama } from "jsr:@divy/libllama";
import process from "node:process";

// Load the model from the first CLI argument, falling back to a local path.
const engine = new Llama({
  model: process.argv[2] || "./models/llama-2-7b-chat.Q2_K.gguf",
});

// Stream the completion token by token; returning true keeps generation going.
const text = process.argv[3];
engine.predict(text, {
  tokenCallback: (token) => {
    process.stdout.write(token);
    return true;
  },
});
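The tokenCallback's boolean return value appears to control whether generation continues: the example returns true to keep streaming. Assuming that contract (this is an illustration, not the package's implementation), it can be sketched with a self-contained mock that emits tokens until the callback declines:

```typescript
// Hypothetical sketch of the streaming-callback contract, not real libllama
// code: emit tokens until the callback returns false.
type TokenCallback = (token: string) => boolean;

function streamTokens(tokens: string[], callback: TokenCallback): string[] {
  const emitted: string[] = [];
  for (const token of tokens) {
    emitted.push(token);
    if (!callback(token)) break; // callback returned false: stop generating
  }
  return emitted;
}

// Stop after the third token by returning false from the callback.
let count = 0;
const out = streamTokens(["The", " meaning", " of", " life"], () => ++count < 3);
console.log(out); // ["The", " meaning", " of"]
```

Returning false from a real tokenCallback would, under this assumption, let a caller cap output length or abort on a stop sequence without waiting for the model to finish.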

Building

Clone the repository with its submodules (git clone --recurse-submodules) and install the prerequisites for building llama.cpp.

Build deno-llama with CMake:

mkdir build
cd build
cmake ..
make

Run the example:

export LIBLLAMA_PATH=../build/libllama.dylib
deno run --allow-ffi example.ts \
    ./models/llama-2-7b-chat.Q2_K.gguf \
    "What is the meaning of life?"
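LIBLLAMA_PATH points the bindings at the shared library the build produced; the filename extension differs by platform (.dylib on macOS, .so on Linux, .dll on Windows). A small helper — purely illustrative, not part of the package's API — could compute the default name when the variable is unset:

```typescript
// Illustrative helper (not part of @divy/libllama): pick the shared-library
// filename for a given platform identifier, as reported by process.platform.
function defaultLibraryName(platform: string, base = "libllama"): string {
  switch (platform) {
    case "darwin":
      return `${base}.dylib`; // macOS
    case "win32":
      return `${base}.dll`;   // Windows
    default:
      return `${base}.so`;    // Linux and other Unix-likes
  }
}

console.log(defaultLibraryName("darwin")); // libllama.dylib
```

Note that --allow-ffi is required because the bindings load a native library via Deno's FFI.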

License

This project is licensed under the MIT License.

Thanks

Thanks to the authors of llama.cpp and go-llama. A lot of the code in this repository is based on their work.

Add Package

deno add @divy/libllama

Import symbol

import * as mod from "@divy/libllama";