Temporal gives you flexibility to define different task queues to route workflows and activities to specific workers.
When a worker starts up, it is configured to consume from a specific task queue by name, along with the activities and workflows it is capable of running.
For example:
import asyncio
import concurrent.futures

from activities import my_good_activity
from temporalio.client import Client
from temporalio.worker import Worker
from workflows import MyGoodWorkflow


async def main():
    client = await Client.connect(...)
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as activity_executor:
        worker = Worker(
            client,
            task_queue="my-task-queue",
            workflows=[MyGoodWorkflow],
            activities=[my_good_activity],
            activity_executor=activity_executor,
        )
        await worker.run()


if __name__ == "__main__":
    print("Starting worker")
    asyncio.run(main())
Suppose we wanted to route workflows through one task queue and activities through another. We could run two separate workers.
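A sketch of that split, with one worker serving only workflows and the other only activities (the queue names here are illustrative, and both workers run concurrently in one process for brevity — they could just as well be separate processes or machines):

```python
import asyncio
import concurrent.futures

from activities import my_good_activity
from temporalio.client import Client
from temporalio.worker import Worker
from workflows import MyGoodWorkflow


async def main():
    client = await Client.connect(...)
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as activity_executor:
        # This worker only polls the workflow task queue.
        workflow_worker = Worker(
            client,
            task_queue="my-workflow-task-queue",
            workflows=[MyGoodWorkflow],
        )
        # This worker only polls the activity task queue.
        activity_worker = Worker(
            client,
            task_queue="my-activity-task-queue",
            activities=[my_good_activity],
            activity_executor=activity_executor,
        )
        await asyncio.gather(workflow_worker.run(), activity_worker.run())


if __name__ == "__main__":
    asyncio.run(main())
```

Workflow code that starts activities would then need to pass the activity queue's name explicitly, since it no longer matches the workflow's own task queue.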
I run a lot of different versions of various languages and tools across my system. Nix and direnv help make this reasonably manageable. Recently, starting a new Python project, I ran into this warning after installing dependencies with pip (yes, I am aware there are new/fresh/fast/cool ways to install dependencies in Python, but that is what this project currently uses).
WARNING: There was an error checking the latest version of pip.
It turned out the file in ~/Library/Caches/pip/selfcheck was corrupted. Removing the directory and reinstalling pip fixed the warning.
On macOS, a Launch Agent is a system daemon that runs in the background and performs various tasks or services for the user.
Having recently installed ollama, I’ve been playing around with various local models. One annoyance about having installed ollama using Nix via nix-darwin is that I need to run ollama serve in a terminal session, or else I see something like this:
❯ ollama list
Error: could not connect to ollama app, is it running?
After some code searching, I discovered a method to create a Launch Agent plist for my user using nix-darwin. This allows ollama serve to run automatically in the background for my user.
Here’s what it looks like:
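A sketch of that configuration, using nix-darwin’s launchd.user.agents module (the package path and keep-alive settings here are my assumptions, not a verified copy of the original config):

```nix
# Sketch: a user Launch Agent that keeps `ollama serve` running.
# Option names come from nix-darwin's launchd.user.agents module;
# the KeepAlive/RunAtLoad choices are illustrative.
{
  launchd.user.agents.ollama = {
    serviceConfig = {
      ProgramArguments = [ "${pkgs.ollama}/bin/ollama" "serve" ];
      RunAtLoad = true;
      KeepAlive = true;
    };
  };
}
```

After a darwin-rebuild switch, launchd starts the agent at login and restarts it if it exits, so `ollama list` works without a dedicated terminal session.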
I’ve been familiar with Python’s -m flag for a while but had never quite internalized what it was really doing. While reading about aider, a cool AI pair programming project, I noticed the docs mention that the tool can be invoked via python -m aider.main “[i]f your pip install did not place the aider executable on your path”. I hadn’t made this association between pip-installed executables and the -m flag.
The source for the file that runs when that Python command is invoked can be found here.
I tried running the following in a project that had the llm tool installed, and things began to make more sense.
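As a stand-in illustration (using the stdlib’s json.tool module rather than llm itself): `python -m <module>` locates the module on sys.path and runs it as a script, exactly the way `python -m pip` runs pip’s `__main__.py` instead of the `pip` console script on your PATH.

```python
import subprocess
import sys

# `python -m json.tool` runs the stdlib json.tool module as a script;
# it reads JSON on stdin and pretty-prints it to stdout.
result = subprocess.run(
    [sys.executable, "-m", "json.tool"],
    input='{"a": 1}',
    capture_output=True,
    text=True,
)
print(result.stdout)
```

The same mechanism is why a package installed into a virtualenv can be invoked with `python -m` even when its console-script entry point isn’t on your PATH.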
I was pulling the openai/evals repo and trying to run some of the examples. The repo uses git-lfs, so I installed it on my system using home-manager.
{ config, pkgs, ... }:
let
  systemPackages = with pkgs; [
    # ...
    git-lfs
    # ...
  ];
in
{
  programs.git = {
    enable = true;
    lfs.enable = true;
    # ...
  };
}
After applying these changes, I could run

git lfs install
git lfs pull

to populate the jsonl files in the repo and run the examples.
I spent yesterday and today working through the excellent guide by Alex on using sqlite-vss to do vector similarity search in a SQLite database.
I’m particularly interested in the benefits one can get from having these tools available locally for getting better insights into non-big datasets with a low barrier to entry.
Combining this plugin with a tool like datasette
gives you a powerful data stack nearly out of the box.
Installing the sqlite-vss extension
The ergonomics of installing and loading vector0.dylib and vss0.dylib are a little unusual. When installing sqlite_vss with pip, the extensions can be loaded directly from Python.
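The sqlite_vss Python package exposes a load() helper for this. A guarded sketch (guarded because the package may not be installed, and because some Python builds, like the macOS system Python, are compiled without loadable-extension support):

```python
import sqlite3

try:
    import sqlite_vss  # from `pip install sqlite-vss`

    db = sqlite3.connect(":memory:")
    # Raises AttributeError on Python builds without extension support.
    db.enable_load_extension(True)
    sqlite_vss.load(db)  # loads vector0 and vss0 into this connection
    db.enable_load_extension(False)
    (message,) = db.execute("select 'vss ' || vss_version()").fetchone()
except (ImportError, AttributeError):
    message = "sqlite_vss not available in this environment"
print(message)
```

Once loaded, the vss0 virtual tables and functions are available for the rest of that connection’s lifetime.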
The standard SQLite shell on macOS doesn’t support arrow-key navigation like many standard CLI programs do. Pressing up, down, right, and left in that order outputs the escape codes ^[[A, ^[[B, ^[[C, and ^[[D in the shell.
A program called rlwrap can shim arrow-key support into sqlite. Install rlwrap (it’s available in both Homebrew and Nixpkgs), then run

rlwrap sqlite <the rest of the command>

and it should just work.
I use Simon’s llm to quickly run LLM prompts. This package is easily installed with brew or pip, so if you want to use it, I recommend those approaches. The following approach is not for the faint of heart and assumes some familiarity with Nix and home-manager. We are going to install the llm CLI, including the llm-mistral plugin, using Nix. It’s not particularly straightforward, but if you want to manage this tool with Nix, it appears to be possible.
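One possible shape for this, heavily hedged: the sketch below assumes both llm and llm-mistral exist as attributes in your nixpkgs revision’s Python package set, which may not be the case — if the plugin isn’t packaged, you’d need to write a derivation for it yourself.

```nix
# Sketch only: assumes the nixpkgs Python set provides both `llm`
# and `llm-mistral`. Bundling them into one interpreter environment
# is what lets the llm CLI discover the plugin at runtime.
{
  home.packages = [
    (pkgs.python3.withPackages (ps: [ ps.llm ps.llm-mistral ]))
  ];
}
```

The key idea is that llm discovers plugins installed into the same Python environment, so the CLI and its plugins must come from a single withPackages environment rather than separate top-level packages.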
I’ve been meaning to try out Simon’s llm package for a while now. From reading the docs and following the development, it’s a modular, meet-you-where-you-are CLI for running LLM inference locally or using almost any API out there. In the past, I might have installed this with brew, but we run nix over here now, so everything is harder first, then reproducible. The llm package/cli is available as a few different nixpkgs
Devbox is an interesting nix-based tool for setting up reproducible development environments. I recently needed to quickly set up a postgres database and load the Chinook dataset to play around with some queries. I could have used Docker, but I’m not a fan of its UI or how heavyweight it has become (looking into podman is also on my todo list), and I’ve been using nix a lot lately, which is what led me to the devbox project. After installing devbox, I set up a project
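A devbox project is defined by a devbox.json in the project directory. A minimal sketch for this use case (the exact package name and version tag in devbox’s Nixpkgs-backed registry are assumptions):

```json
{
  "packages": ["postgresql@latest"]
}
```

With that in place, `devbox shell` drops you into an environment where postgres is on the PATH, without installing anything globally.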