Question: if you have a large dataset and want to turn it into an API that developers can call to query it, and you want to charge them for that access, is there a "standard" easy way to do this?
The fantasy would be along the lines of "here's the DB name and here's a CSV file, go make me a: dev portal / signup / billing / usage dashboard with auto payment integration / API key management / code samples in multiple languages"
1. On *impact* we could think about how quickly a pull request lands on master, how long it takes a new dev to start landing commits, and how quickly tickets get closed after being claimed.
2. On *investment* we could look at dollars and time spent on DevEx as a proportion of engineers at the company.
Finally, we could measure developer sentiment (CSAT) to see if the work is being qualitatively felt.
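The impact metrics above all reduce to "time between two events", so they're cheap to compute from existing tooling data. A sketch for the first one (PR open to merge), with hypothetical hardcoded timestamps standing in for whatever the issue tracker or Git host exports:

```python
# Median PR lead time from open->merge timestamp pairs (ISO 8601).
# Median rather than mean, so one stuck PR doesn't swamp the metric.
from datetime import datetime
from statistics import median

def median_merge_hours(prs):
    """prs: list of (opened_at, merged_at) ISO-8601 string pairs."""
    deltas = [
        (datetime.fromisoformat(merged) - datetime.fromisoformat(opened))
        .total_seconds() / 3600
        for opened, merged in prs
    ]
    return median(deltas)

prs = [
    ("2023-01-02T09:00:00", "2023-01-02T15:00:00"),  # 6 h
    ("2023-01-03T10:00:00", "2023-01-04T10:00:00"),  # 24 h
    ("2023-01-05T08:00:00", "2023-01-05T09:30:00"),  # 1.5 h
]
print(median_merge_hours(prs))  # 6.0
```

The same shape works for time-to-first-commit (hire date to first merged commit) and ticket claim-to-close.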
I suspect that one of the most powerful points of leverage for any tech company - in a downturn or no - is excellent developer experience.
Great dev tooling makes it a joy to work on a codebase, increases the speed at which new hires can become effective, and increases retention. This means the company can execute faster and adapt more quickly, too.
That means we should think about objective metrics to measure DevEx by both impact and investment. (to be continued)
It turns out the best way to do AI prompt engineering is with AI, surprise!
Some thoughts on setting up Mastodon. When I tried to manually configure it on my own Ubuntu 22 server, I got into config hell: older Ruby choked on the system OpenSSL, and the newer Ruby that fixes that isn't compatible with the latest Mastodon. DigitalOcean's droplet image uses a much older Ubuntu and handpicked a working set of versions. I crashed the instance with OOM on only 1GB of RAM (Redis kept dying), so I had to resize to a 2GB droplet. 1 vCPU actually seems fine for my needs so far.
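A cheaper first thing to try before resizing a RAM-starved box is adding swap; small droplets often ship with none. A sketch, assuming root on an Ubuntu droplet (whether Mastodon stays comfortable on 1GB even with swap is a separate question):

```shell
# Create and enable a 2G swapfile
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Persist it across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```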