I found a very satisfying, time-saving use of a model through Cursor: generating an inline shell script to reformat some data for me.
The goal was to reformat json files that each contained a bare top-level list into ones that wrap that same list in a single-key object.
I used the following prompt:
> for each json file in the current folder, read the json (which is a json list) and turn it into a json object with a single key, “cells” whose value is the json list
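The transformation itself is tiny: each file holds a bare list, and the output wraps that list as the value of a single “cells” key. As a rough sketch of the same idea, here in TypeScript with Deno rather than the inline shell script the model generated:

```ts
// Wrap each top-level JSON list in the current folder as { "cells": [...] }.
// (Sketch of the transformation, not the generated shell script.)
for await (const entry of Deno.readDir(".")) {
  if (!entry.isFile || !entry.name.endsWith(".json")) continue;

  const parsed = JSON.parse(await Deno.readTextFile(entry.name));
  if (!Array.isArray(parsed)) continue; // skip files that aren't bare lists

  await Deno.writeTextFile(entry.name, JSON.stringify({ cells: parsed }, null, 2));
}
```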
Matthew wrote a thread summarizing Apple’s private cloud and their security approach.
Between the speed at which models are changing and their size, it’s not currently practical to run things like LLM inference on iPhones, at least not for the best available models.
These models will probably always be larger, or require more compute, than handheld devices can handle.
If you want to use the best models, you’ll need to offload inference to a data center.
I spent some time moving my bookmarks and workflows to Raindrop from Pocket.
I’m mostly done with a short post on how and why I did that.
I’d like to use the `python-raindropio` package to generate stubs for logs posts from links I bookmark, including comments I write about them. The Nix flake I use for this blog hampered those efforts. The `python-raindropio` package isn’t available as a Nix package and I’m not particularly well versed in building my own packages with Nix, so that became a bit of a rabbit hole.
While poking around for other available libraries, I found that this one isn’t even maintained anymore. It was forked as `raindrop-io-py`, and it looks like that fork might not be maintained anymore either.
Not the greatest sign, but the package does still work for the time being.
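Setting the packaging question aside, the stub-generation idea itself is simple. A very rough sketch, hitting the Raindrop REST API directly instead of going through `python-raindropio` (the endpoint, response fields, and the logs/ output path are assumptions, not something I have running):

```ts
// Hypothetical sketch: fetch bookmarks from Raindrop and write a stub post
// for each one, including the note I left on the bookmark.
// The endpoint, response fields, and output path are assumptions.
const token = Deno.env.get("RAINDROP_TOKEN");

const res = await fetch("https://api.raindrop.io/rest/v1/raindrops/0", {
  headers: { Authorization: `Bearer ${token}` },
});
const { items } = await res.json();

for (const item of items) {
  const stub = [
    "---",
    `title: "${item.title}"`,
    `link: ${item.link}`,
    "---",
    "",
    item.note ?? "",
  ].join("\n");
  await Deno.writeTextFile(`logs/${item._id}.md`, stub);
}
```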
I was away from the computer for a couple of weeks.
That was really nice.
These were some of my favorite things I read during my downtime and in transit over the past two weeks. I’m planning to come up with a better system for logging and adding thoughts to the stuff I read, but for now a list will have to do:
- Sabrina wrote an interesting write-up on solving a math problem with gpt-4o. It turned out the text-only, chain-of-thought approach was the best performing, which is not what I would have guessed.
- It was cool to see Simon dive into LLM-driven data extraction using his project datasette in this video. Using multi-modal models for data extraction seems to bring a new level of usefulness and makes these models even more general purpose.
- Nostalgia: https://maggieappleton.com/teenage-desktop. I wish I had done something like this. Maybe I can find something on an old hard drive.
I’m looking into creating a Deno server that can manage multiple websocket connections and emit to one after receiving a message from another.
A simple way to implement this is to run a single server and track all the ongoing connections to websocket clients in memory.
I’m learning more about approaches that could support a multi-server backend.
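As a rough sketch of that single-server approach (assuming plain WebSocket clients and that a message from one client should simply be relayed to every other connected client):

```ts
// Single-server sketch: keep every open socket in a Set and relay each
// incoming message to all the other connected clients.
const sockets = new Set<WebSocket>();

Deno.serve((req) => {
  if (req.headers.get("upgrade") !== "websocket") {
    return new Response("expected a websocket upgrade", { status: 400 });
  }

  const { socket, response } = Deno.upgradeWebSocket(req);

  socket.addEventListener("open", () => sockets.add(socket));
  socket.addEventListener("close", () => sockets.delete(socket));

  socket.addEventListener("message", (event) => {
    for (const other of sockets) {
      if (other !== socket && other.readyState === WebSocket.OPEN) {
        other.send(event.data);
      }
    }
  });

  return response;
});
```

The catch is that the Set only knows about connections held by this one process, which is exactly what makes the multi-server version harder: that shared state, or the relaying itself, has to live somewhere every instance can reach.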
I take an irrational amount of pleasure in disabling notifications for apps that use them to send me marketing.
I enjoyed reading Yuxuan’s article on whether GitHub Copilot increased their productivity.
I personally don’t love Copilot but enjoy using other AI-assisted tools like Cursor, which let me use more capable models than Copilot does.
It’s encouraging to see more folks keeping unfiltered thought journals.
I read this post by Steph today and loved it.
I want to try writing this concisely.
I imagine it takes significant effort, but the results are beautiful, satisfying and valuable.
It’s a privilege to read a piece written by someone who values every word.