
To install Gradio from main, run the following command:

pip install https://gradio-builds.s3.amazonaws.com/9898c617676ca0f7d8d6c1961e4bd498be2b2c11/gradio-4.36.1-py3-none-any.whl

Note: Setting share=True in launch() will not work when installing from main.

load

gradio.load(···)

Description

Constructs a demo from a Hugging Face repo. Can accept model repos (if src is "models") or Space repos (if src is "spaces"). The input and output components are automatically loaded from the repo. Note that if a Space is loaded, certain high-level attributes of the Blocks (e.g. custom css, js, and head attributes) will not be loaded.

Example Usage

import gradio as gr
demo = gr.load("gradio/question-answering", src="spaces")
demo.launch()
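
gr.load can also construct a demo directly from a model repo by using "models" as the source. A minimal sketch, reusing the "gpt2" model name mentioned in the parameter descriptions below:

import gradio as gr

# Load a model repo from the Hugging Face Hub; the input and output
# components are loaded automatically from the repo.
demo = gr.load("models/gpt2")
demo.launch()

Here the source is given as a prefix in the name; gr.load("gpt2", src="models") is equivalent.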

Initialization

Parameters

name: str (required)
The name of the model (e.g. "gpt2" or "facebook/bart-base") or Space (e.g. "flax-community/spanish-gpt2"). Can include the src as a prefix (e.g. "models/facebook/bart-base").

src: str | None (default: None)
The source of the model: "models" or "spaces" (or leave empty if the source is provided as a prefix in name).

hf_token: str | None (default: None)
Optional access token for loading private Hugging Face Hub models or Spaces. Find your token here: https://huggingface.co/settings/tokens. Warning: only provide this if you are loading a trusted private Space, as it can be read by the Space you are loading. See the example after this list.

alias: str | None (default: None)
Optional string used as the name of the loaded model instead of the default name (only applies if loading a Space running Gradio 2.x).
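
For private Spaces, pass hf_token so the request to the Hub is authenticated. A minimal sketch, assuming a hypothetical private Space "my-org/private-qa" and a token stored in the HF_TOKEN environment variable:

import os
import gradio as gr

# Only pass hf_token for Spaces you trust: the loaded Space can read the token.
demo = gr.load(
    "my-org/private-qa",   # hypothetical private Space repo
    src="spaces",
    hf_token=os.environ["HF_TOKEN"],
)
demo.launch()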