Datasette 1.0a17 is the latest Datasette 1.0 alpha release, with bug fixes and small feature improvements from the last few months.
Python 3.13 was released today. Datasette 1.0a16 is compatible with Python 3.13, but Datasette 0.64.8 was not. The new Datasette 0.65 release fixes compatibility with the new version of Python.
Datasette 1.0a14 includes some breaking changes to how metadata works for plugins, described in detail in the new upgrade guide. See also the annotated release notes that accompany this release.
Datasette 1.0a10 is a focused alpha that changes some internal details about how Datasette handles transactions. The `datasette.execute_write_fn()` internal method now wraps the function in a database transaction unless you pass `transaction=False`.
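As a rough illustration, here is a minimal sketch of calling that method from plugin code. The database name, table and cleanup query are invented; `get_database()` and `execute_write_fn()` (which lives on the Database object) are the documented internals:

```python
# Sketch only: the "mydb" database and "logs" table are placeholders.
async def cleanup_old_logs(datasette):
    db = datasette.get_database("mydb")

    def cleanup(conn):
        conn.execute("delete from logs where created < date('now', '-30 days')")

    # As of 1.0a10 this runs inside a transaction by default:
    await db.execute_write_fn(cleanup)

    # Opt out of the automatic transaction:
    await db.execute_write_fn(cleanup, transaction=False)
```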
Datasette 1.0a9 adds basic alter table support to the JSON API, tweaks how permissions work and introduces some new plugin debugging utilities.
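For example, the write API's insert endpoint can be asked to alter the table to add any missing columns. This is a hedged sketch rather than an excerpt from the release notes: the instance URL, database, table, rows and token are placeholders, and it assumes the documented `/database/table/-/insert` endpoint accepting an "alter": true key:

```python
# Sketch: insert rows via the JSON write API, adding missing columns.
# URL, database, table, rows and token below are all placeholders.
import json
import urllib.request

body = json.dumps({
    "rows": [{"id": 1, "name": "Cleo", "age": 5}],
    "alter": True,  # add any columns (e.g. "age") missing from the table
}).encode("utf-8")

request = urllib.request.Request(
    "http://localhost:8001/mydatabase/mytable/-/insert",
    data=body,
    method="POST",
    headers={
        "Authorization": "Bearer YOUR-API-TOKEN",
        "Content-Type": "application/json",
    },
)
print(urllib.request.urlopen(request).read().decode("utf-8"))
```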
Datasette 1.0a8 introduces several new plugin hooks, a JavaScript plugin system and moves plugin configuration from `metadata.yaml` to `datasette.yaml`. Read more about the release in the annotated release notes for 1.0a8.
Datasette Enrichments is a new feature for Datasette that supports enriching data by running custom code against every selected row in a table. Read Datasette Enrichments: a new plugin framework for augmenting your data for more details, plus a video demo of enrichments for geocoding addresses and processing text and images using GPT-4.
datasette-comments is a new plugin by Alex Garcia which adds collaborative commenting to Datasette. Alex built the plugin for Datasette Cloud, but it's also available as an open source package for people who are hosting their own Datasette instances. See Annotate and explore your data with datasette-comments on the Datasette Cloud blog for more details.
Datasette 1.0a4 has a fix for a security vulnerability in the Datasette 1.0 alpha series: the API explorer interface exposed the names of private databases and tables in public instances that were protected by a plugin such as datasette-auth-passwords, though not the actual content of those tables. See the security advisory for more details and workarounds if you can't upgrade immediately. The latest edition of the Datasette Newsletter also talks about this issue.
datasette-write-ui: a Datasette plugin for editing, inserting, and deleting rows introduces a new plugin adding add/edit/delete functionality to Datasette, developed by Alex Garcia. Alex built this for Datasette Cloud, and this post is the first announcement made on the new Datasette Cloud blog - see also Welcome to Datasette Cloud.
Datasette 1.0a3 is an alpha release of Datasette that previews the new default JSON API design that’s coming in version 1.0 - the single most significant change planned for that 1.0 release.
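To get a feel for the new shape, here is a sketch that fetches a table from the public latest.datasette.io demo (which runs the latest development version of Datasette) and prints the new default top-level keys:

```python
# Sketch: inspect the new default table JSON format.
import json
import urllib.request

url = "https://latest.datasette.io/fixtures/facetable.json"
with urllib.request.urlopen(url) as response:
    data = json.load(response)

print(data["ok"])         # True on success
print(data["truncated"])  # True if the row limit cut the results short
print(data["rows"][0])    # each row is an object keyed by column name
```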
New tutorial: Data analysis with SQLite and Python. This tutorial, originally presented at PyCon 2023, includes a 2h45m video and an extensive handout that should be useful with or without the video. Topics covered include Python's `sqlite3` module, `sqlite-utils`, Datasette, Datasette Lite, advanced SQL patterns and more.
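As a small taste of the first two topics (this snippet is illustrative, not taken from the handout):

```python
# Illustrative only - not from the tutorial handout.
import sqlite3
import sqlite_utils

# Plain sqlite3 from the standard library:
conn = sqlite3.connect("data.db")
conn.execute("create table if not exists pets (id integer primary key, name text)")
conn.execute("insert into pets (name) values (?)", ("Cleo",))
conn.commit()

# The same database through sqlite-utils, which builds tables from dicts:
db = sqlite_utils.Database("data.db")
db["pets"].insert({"name": "Pancakes"})
print(list(db["pets"].rows))
```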
I built a ChatGPT plugin to answer questions about data hosted in Datasette describes a new experimental Datasette plugin to enable people to query data hosted in a Datasette interface via ChatGPT, asking human language questions that are automatically converted to SQL and used to generate a readable response.
Using Datasette in GitHub Codespaces is a new tutorial showing how Datasette can be run in GitHub's free Codespaces browser-based development environments, using the new datasette-codespaces plugin.
Examples of sites built using Datasette now includes screenshots of Datasette deployments that illustrate a variety of problems that can be addressed using Datasette and its plugins.
Tool support is finally here! This release adds support for exposing tools to LLMs, previously described in the release notes for 0.26a0 and 0.26a1.
Read Large Language Models can run tools in your terminal with LLM 0.26 for a detailed overview of the new features.
Also in this release:
- New default tools: `llm_version()` and `llm_time()`. #1096, #1103
- `tool_instances` table records details of Toolbox instances created while executing a prompt. #1089
- `llm.get_key()` is now a documented utility function. #1094

Hopefully the last alpha before a stable release that includes tool support.
- Tools can now be used with `llm chat`.
- `AsyncModel` can now run tools, including tool functions defined as `async def`. This enables non-blocking tool calls for potentially long-running operations (see the sketch after this list). (#1063)
- `llm chat` now supports adding fragments during a session. Use the new `!fragment <id>` command while chatting to insert content from a fragment. Initial fragments can also be passed to `llm chat` using `-f` or `--sf`. Thanks, Dan Turkel. (#1044, #1048)
- Filter `llm logs` by tools.
- `llm schemas list` can output JSON. New `--json` and `--nl` (newline-delimited JSON) options to `llm schemas list` for programmatic access to saved schema definitions. (#1070)
- Filter `llm similar` results by ID prefix. New `--prefix` option for `llm similar` allows searching for similar items only within IDs that start with a specified string (e.g., `llm similar my-collection --prefix 'docs/'`). Thanks, Dan Turkel. (#1052)
- New `--chain-limit <N>` (or `--cl`) option for `llm prompt` and `llm chat` to specify the maximum number of consecutive tool calls allowed for a single prompt. Defaults to 5; set to 0 for unlimited. (#1025)
- New `llm plugins --hook <NAME>` option.
- `llm tools list` now shows toolboxes and their methods. (#1013)
- `llm prompt` and `llm chat` now automatically re-enable plugin-provided tools when continuing a conversation (`-c` or `--cid`). (#1020)
- The `--tools-debug` option now pretty-prints JSON tool results for improved readability. (#1083)
- New `LLM_TOOLS_DEBUG` environment variable to permanently enable `--tools-debug`. (#1045)
- `llm chat` sessions now correctly respect default model options configured with `llm models set-options`. Thanks, André Arko. (#985)
- New `--pre` option for `llm install` to allow installing pre-release packages. (#1060)
- OpenAI models (`gpt-4o`, `gpt-4o-mini`) now explicitly declare support for tools and vision. (#1037)
- The `supports_tools` parameter is now supported in `extra-openai-models.yaml`. Thanks, Mahesh Hegde. (#1068)
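Here is a rough sketch of what the async tool support can look like; it assumes the async API mirrors the synchronous `chain()` method shown in the 0.26a0 notes below, and the tool, model ID and data are placeholders:

```python
# Rough sketch only: assumes AsyncModel exposes the same chain() interface
# as the synchronous API. The tool function and model ID are placeholders.
import asyncio
import llm


async def lookup_population(country: str) -> int:
    """Pretend to look up a country's population (stands in for a slow call)."""
    await asyncio.sleep(0.1)  # placeholder for a network request
    return 67_000_000


async def main():
    model = llm.get_async_model("gpt-4.1-mini")
    # async def tool functions can run without blocking the event loop
    response = model.chain(
        "What is the population of France?",
        tools=[lookup_population],
    )
    print(await response.text())


asyncio.run(main())
```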
This is the first alpha to introduce support for tools! Models with tool capability (which includes the default OpenAI model family) can now be granted access to execute Python functions as part of responding to a prompt.
Tools are supported by the command-line interface:
```bash
llm --functions '
def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y
' 'what is 34234 * 213345'
```
And in the Python API, using a new `model.chain()` method for executing multiple prompts in a sequence:
```python
import llm

def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

model = llm.get_model("gpt-4.1-mini")
response = model.chain(
    "What is 34234 * 213345?",
    tools=[multiply]
)
print(response.text())
```
New tools can also be defined using the register_tools() plugin hook. They can then be called by name from the command-line like this:
```bash
llm -T multiply 'What is 34234 * 213345?'
```
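A minimal sketch of such a plugin module (the `multiply()` tool mirrors the toy example above; the `register_tools()` hook name comes from the notes, while the exact layout here is an assumption and packaging details are omitted):

```python
# Sketch of a plugin that registers a tool via the register_tools() hook.
import llm


def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y


@llm.hookimpl
def register_tools(register):
    # register() exposes the function as a tool named "multiply"
    register(multiply)
```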
Tool support is currently under active development. Consult this milestone for the latest status.
- Upserts now use the `INSERT ... ON CONFLICT SET` syntax on all SQLite versions later than 3.23.1. This is a very slight breaking change for apps that depend on the previous `INSERT OR IGNORE` followed by `UPDATE` behavior. (#652)
- To restore the old behavior, pass `use_old_upsert=True` to the `Database()` constructor, see Alternative upserts using INSERT OR IGNORE.
- `sqlite-utils tui` is now provided by the sqlite-utils-tui plugin. (#648)
- `INSERT ... ON CONFLICT SET` syntax was added. (#654)
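A quick sketch of what the upsert change means in practice; the table and rows are made up, and `use_old_upsert` is the constructor option mentioned above:

```python
# Sketch: upserting with sqlite-utils. On SQLite versions newer than 3.23.1
# this now issues a single INSERT ... ON CONFLICT SET statement.
import sqlite_utils

db = sqlite_utils.Database("data.db")
db["people"].upsert({"id": 1, "name": "Cleo", "age": 6}, pk="id")

# Opt back in to the previous INSERT OR IGNORE followed by UPDATE behavior:
old_style_db = sqlite_utils.Database("data.db", use_old_upsert=True)
old_style_db["people"].upsert({"id": 1, "name": "Cleo", "age": 7}, pk="id")
```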
- New OpenAI models: `gpt-4.1`, `gpt-4.1-mini`, `gpt-4.1-nano`, `o3`, `o4-mini`. #945, #965, #976.
- New environment variables `LLM_MODEL` and `LLM_EMBEDDING_MODEL` for setting the model to use without needing to specify `-m model_id` every time. #932
- New `llm fragments loaders` command, to list all currently available fragment loader prefixes provided by plugins. #941
- `llm fragments` command now shows fragments ordered by the date they were first used. #973
- `llm chat` now includes a `!edit` command for editing a prompt using your default terminal text editor. Thanks, Benedikt Willi. #969
- Allow `-t` and `--system` to be used at the same time. #916
- `llm -c/--continue` now works correctly with the `-d/--database` option. `llm chat` now accepts that `-d/--database` option. Thanks, Sukhbinder Singh. #933

- The `prepare_connection()` hook no longer runs for the internal database. #2468
- Fixed a bug where `link:` HTTP headers used invalid syntax. #2470

- `llm models --options` now shows keys and environment variables for models that use API keys. Thanks, Steve Morin. #903
- Added a `py.typed` marker file so LLM can now be used as a dependency in projects that use `mypy` without a warning. #887
- `$` characters can now be used in templates by escaping them as `$$`. Thanks, @guspix. #904
- LLM now uses `pyproject.toml` instead of `setup.py`. #908