
Latest Posts by Olko Koval

Metrics I care about for promptctl: "did someone run it twice?" and "did they set a custom memory dir?" Usage > stars. Building in public, one release at a time.

16 hours ago

You can try promptctl at prompt-ctl.com without installing anything: paste a prompt, hit enhance, see the result. Sign in with GitHub or Google for 10 free tries.

Still free. Still no sign-up required to read docs or install the CLI.

16 hours ago

Built promptctl because I was tired of pasting "you are an expert X, do Y, output Z" into every chat. Now it's `promptctl enhance --template=review`. Templates live in ~/.promptctl. Same idea, 10x less copy-paste.

16 hours ago

Why a single binary for promptctl? No npm install, no pip, no venv. You're in the terminal to fix a prompt — you run one command. Friction kills adoption. One binary, one install, done.

16 hours ago

Most LLM cost problems aren't about model choice — they're about prompt structure. Unstructured prompts cost 2-3x in follow-up calls. We built a CLI that structures them. ~67% fewer tokens. Same quality.

prompt-ctl.com

16 hours ago


Okay so here’s my actual Bluesky-is-dying hypothesis:

The entire web is dying. Users aren’t going from Bluesky to another site (X/Insta/Threads/TikTok). Users are going to chatbots.

I know traffic to news sites has cratered (down something like 90%). My hunch is traffic to all the social platforms is down too.

4 days ago 3788 505 200 168
A black and white soccer ball with handwritten numbers resting on a crumpled red plastic tarp, urban documentary photography.

Random things left behind in the city can form such visually pleasing, quiet compositions. Finding unexpected color and texture on the streets of The Hague.

Fujifilm X-T5
Fujinon 23mm f/1.4

#xt5 #fujifilmxt5 #fujinon23mm14

4 days ago

Lovely colors and composition!

4 days ago 1 0 1 0
Swans chilling in The Hague park

Finally warm in The Hague 🦢

4 days ago


#docker #git #devops #automation

5 days ago