🦍 Gorilla: Large Language Model Connected with Massive APIs

Shishir G. Patil*, Tianjun Zhang*

UC Berkeley
sgp@berkeley.edu, tianjunz@berkeley.edu

Blogs | GitHub | HuggingFace | Discord

An API Appstore for LLMs


Gorilla OpenFunctions

Teach LLMs tool use


🎁 In OpenFunctions-v2, we natively train the model to support parallel functions (generating multiple function calls in a single turn) and multiple functions (selecting one or more functions from those provided). Java/REST/Python APIs are also supported for the first time, with extended data types.
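Below is a minimal sketch of querying an OpenFunctions-style model through an OpenAI-compatible chat-completions interface (openai<1.0 client). The base URL, model name, function schema, and prompt are illustrative placeholders, not the official hosted endpoint; substitute the values from the Gorilla repository.

```python
# Minimal sketch: OpenFunctions via an OpenAI-compatible server (openai<1.0 style).
# The base URL and model name below are placeholders (assumptions) -- point
# them at your own deployment or the endpoint listed in the Gorilla repo.
import openai

openai.api_key = "EMPTY"                      # self-hosted servers typically ignore the key
openai.api_base = "http://localhost:8000/v1"  # placeholder endpoint

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. Berkeley"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}]

completion = openai.ChatCompletion.create(
    model="gorilla-openfunctions-v2",
    temperature=0.0,
    messages=[{"role": "user", "content": "What's the weather in Berkeley and in Paris?"}],
    functions=functions,
)
# A parallel-call query like this one can yield two calls in a single turn, e.g.
#   get_current_weather(city="Berkeley"); get_current_weather(city="Paris")
print(completion.choices[0].message)
```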

BFCL

Benchmarking the function-calling capabilities of LLMs


πŸ† Berkeley Function-Calling Leaderboard (BFCL) πŸ“Š aims to provide a thorough study of the function-calling capability of different LLMs. It consists of 2k πŸ“ question-function-answer pairs with multiple languages (🐍 Python, β˜• Java, 🟨 JavaScript, 🌐 REST API), diverse application domains, and complex use cases (multiple and parallel function calls). We also investigate function relevance detection πŸ•΅οΈβ€β™‚οΈ, to determine how the model will react when the provided function is not suitable to answer the user's question.

RAFT

A better way to do RAG


RAFT: Retrieval Augmented Fine-Tuning for domain-specific RAG 🚀 Drawing parallels between LLMs and students in open-book (RAG) 📔 and closed-book (SFT) 🧠 exams, we present a better recipe for fine-tuning a base LLM for RAG-focused challenges. Discover how RAFT prepares LLMs to excel on a specific document set, mirroring students' prep for finals! 🎓
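The sketch below shows one way a RAFT-style training example could be assembled: a question, an oracle document mixed with distractor documents, and a chain-of-thought target grounded in the oracle. The function name, document tags, and oracle fraction are illustrative assumptions, not the paper's exact recipe.

```python
# Rough sketch of assembling a RAFT-style fine-tuning example (illustrative
# names and fractions; see the RAFT paper/repo for the actual recipe).
import random

def make_raft_example(question, oracle_doc, distractor_docs, cot_answer,
                      p_oracle=0.8, num_distractors=3):
    """With probability p_oracle the oracle document is kept in the context;
    otherwise only distractors are shown, so the model learns both to use
    relevant retrieved passages and to ignore irrelevant ones."""
    docs = random.sample(distractor_docs, k=min(num_distractors, len(distractor_docs)))
    if random.random() < p_oracle:
        docs.append(oracle_doc)
    random.shuffle(docs)
    context = "\n\n".join(f"<DOC>\n{d}\n</DOC>" for d in docs)
    prompt = f"{context}\n\nQuestion: {question}\nAnswer:"
    # The completion is a chain-of-thought answer that quotes the oracle
    # document, teaching the model to ground its answer in the right passage.
    return {"prompt": prompt, "completion": cot_answer}
```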

GoEX

A runtime for executing LLM-generated actions


Gorilla Execution Engine (GoEX) is a runtime for LLM-generated actions like code, API calls, and more. It features "post-facto validation" for assessing LLM actions after execution 🔍. Key to our approach are the "undo" 🔄 and "damage confinement" abstractions for managing unintended actions and risks. This paves the way for fully autonomous LLM agents, enhancing interaction between apps and services with humans out of the loop. 🚀
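For intuition, here is a conceptual sketch of the undo idea (assumed names, not GoEX's actual API): every LLM-generated action is paired with a compensating reversal, and a post-facto check decides whether its side effects are kept or rolled back.

```python
# Conceptual sketch of "undo" + post-facto validation (not GoEX's real API).
from dataclasses import dataclass
from typing import Callable

@dataclass
class ReversibleAction:
    run: Callable[[], object]   # the LLM-generated action (code, API call, ...)
    undo: Callable[[], None]    # compensating action restoring the prior state

def execute_with_post_facto_validation(action: ReversibleAction,
                                        validate: Callable[[object], bool]):
    result = action.run()        # execute first ...
    if not validate(result):     # ... then validate after the fact
        action.undo()            # damage confinement: revert the side effect
        raise RuntimeError("Action failed post-facto validation and was undone")
    return result

# Toy usage: "send" a message to an outbox, kept only if validation passes.
outbox = []
action = ReversibleAction(
    run=lambda: (outbox.append("hello"), "sent")[-1],
    undo=lambda: outbox.pop(),
)
execute_with_post_facto_validation(action, validate=lambda r: r == "sent")
```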

News

🚒 GoEx: a runtime for executing LLM-generated actions like code & API calls. GoEx presents "undo" and "damage confinement" abstractions for mitigating the risk of unintended actions taken in LLM-powered systems. Release blog, Paper.
🎉 Berkeley Function Calling Leaderboard! How do models stack up for function calling? 🎯 Read more in our Release Blog.
🏆 Gorilla OpenFunctions v2 sets a new SoTA for open-source LLMs 💪 On par with GPT-4 🙌 Supports more languages 👌
🥇 Gorilla OpenFunctions! Drop-in replacement! Examples
🚀 Try Gorilla in 60s! No sign-ups, no installs, just Colab!
🤩 With Apache 2.0 licensed LLM models, you can use Gorilla commercially without any obligations!
📣 We are excited to hear your feedback, and we welcome API contributions as we build this open-source project. Join us on Discord or feel free to email us!

Gorilla for your CLI and Spotlight Search

Gorilla-powered CLI
Get started with pip install gorilla-cli

Gorilla-powered Spotlight Search
Gorilla-Spotlight Signup

Vision

With the user at the center, Gorilla enables users to interact with a wide range of services through LLMs. Gorilla is an open-source, state-of-the-art LLM that invokes API calls to interact with those services!

Contact Us

Citation

@article{patil2023gorilla,
    title={Gorilla: Large Language Model Connected with Massive APIs},
    author={Shishir G. Patil and Tianjun Zhang and Xin Wang and Joseph E. Gonzalez},
    journal={arXiv preprint arXiv:2305.15334},
    year={2023},
}