Evaggelos Balaskas - System Engineer

The sky above the port was the color of television, tuned to a dead channel

Sep 21, 2025
AI Notes: Using LLM + Perplexity from the terminal
Posted by ebal at 18:56:33 in blog

🖥️ I’ve been playing around with the Python CLI tool LLM and Perplexity, trying to get a setup that works nicely from the command line. Below are my notes: what worked, what I stumbled on, and how you can replicate it.


📌 Background & Why

I like working with tools that let me automate or assist me with shell commands, especially when exploring files, searching, or scripting stuff. LLM + Perplexity give me that power: AI suggestions + execution.

If you’re new to this, it helps you avoid googling every little thing, but still keeps you in control.

Also, I have a Perplexity Pro account, and I want to learn how to use it from my Linux command line.


⚙️ Setup: Step by Step

1️⃣ Prepare a Python virtual environment

I prefer isolating things so I don’t mess up my global Python. Here’s how I did it: create a new Python virtual environment and activate it:

PROJECT="llm"

python3 -m venv ~/.venvs/${PROJECT}
source ~/.venvs/${PROJECT}/bin/activate

# Install llm project
pip install -U ${PROJECT}

This gives you a clean llm install.
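Since the venv has to be activated again in every new shell, a small convenience function saves some typing. This is my own addition (llm_env is not part of the llm docs), assuming the venv lives at ~/.venvs/llm as above:

```shell
# Optional convenience: a small function to jump into the llm
# virtualenv from any new shell session. Add it to ~/.bashrc or ~/.zshrc.
llm_env() {
  # shellcheck disable=SC1091
  . "${HOME}/.venvs/llm/bin/activate"
}
```

After sourcing your rc file, `llm_env` drops you into the venv, and the usual `deactivate` leaves it.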


2️⃣ Get Perplexity API key 🔑

You’ll need an API key from Perplexity to use their model via LLM.

  • Go to Perplexity.ai 🌐

  • Sign in / register

  • Go to your API keys page: https://www.perplexity.ai/account/api/keys

  • Copy your key

Be careful: in order to get an API key, you need to enter your bank card details. My account has a free tier of 5 USD. You can review your token usage via the Usage metrics in the API Billing section.
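Since the key is sensitive, I avoid pasting it on a command line where it would end up in shell history. A small helper of my own (read_api_key is not part of llm; `llm keys set perplexity` below prompts for the key anyway) reads it silently. The pplx- prefix check reflects the format of the key I was issued; treat it as a loose sanity check, not a spec:

```shell
# Read an API key without echoing it to the terminal or leaving it in
# shell history, then do a loose format check without printing the key.
read_api_key() {
  printf 'Perplexity API key: ' >&2
  IFS= read -rs PPLX_KEY
  echo >&2
  case "${PPLX_KEY}" in
    pplx-*) echo "looks like a Perplexity key" ;;
    *)      echo "warning: unexpected key format" ;;
  esac
}
```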


3️⃣ Install plugins for LLM 🧩

I used two plugins:

  • ⚡ llm-cmd — for LLM to suggest/run shell commands

  • 🔍 llm-perplexity — so LLM can use Perplexity as a model provider

Commands:

llm install llm-cmd

llm install llm-perplexity

Check what’s installed:

llm plugins

Sample output:

[
  {
    "name": "llm-cmd",
    "hooks": [
      "register_commands"
    ],
    "version": "0.2a0"
  },
  {
    "name": "llm-perplexity",
    "hooks": [
      "register_models"
    ],
    "version": "2025.6.0"
  }
]
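To double-check that both plugins actually registered, the JSON above can be parsed mechanically. Here is a small helper of my own (check_llm_plugins is hypothetical, and it assumes the JSON output format shown above):

```shell
# Report whether both plugins registered, given the JSON that
# `llm plugins` prints on stdin. Usage: llm plugins | check_llm_plugins
check_llm_plugins() {
  python3 -c '
import json, sys
names = {p["name"] for p in json.load(sys.stdin)}
for required in ("llm-cmd", "llm-perplexity"):
    print(required, "OK" if required in names else "MISSING")
'
}
```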

4️⃣ Configure your Perplexity key inside LLM 🔐

Tell LLM your Perplexity key so it can use it:

❯ llm keys set perplexity
# then paste your API key when prompted

Verify:

❯ llm keys
perplexity

You should see just “perplexity” listed (the key name, not the key itself), meaning it is stored.


Available models inside LLM 📋

Verify and view the available models:

llm models

The result on my setup, with Perplexity enabled, is:

OpenAI Chat: gpt-4o (aliases: 4o)
OpenAI Chat: chatgpt-4o-latest (aliases: chatgpt-4o)
OpenAI Chat: gpt-4o-mini (aliases: 4o-mini)
OpenAI Chat: gpt-4o-audio-preview
OpenAI Chat: gpt-4o-audio-preview-2024-12-17
OpenAI Chat: gpt-4o-audio-preview-2024-10-01
OpenAI Chat: gpt-4o-mini-audio-preview
OpenAI Chat: gpt-4o-mini-audio-preview-2024-12-17
OpenAI Chat: gpt-4.1 (aliases: 4.1)
OpenAI Chat: gpt-4.1-mini (aliases: 4.1-mini)
OpenAI Chat: gpt-4.1-nano (aliases: 4.1-nano)
OpenAI Chat: gpt-3.5-turbo (aliases: 3.5, chatgpt)
OpenAI Chat: gpt-3.5-turbo-16k (aliases: chatgpt-16k, 3.5-16k)
OpenAI Chat: gpt-4 (aliases: 4, gpt4)
OpenAI Chat: gpt-4-32k (aliases: 4-32k)
OpenAI Chat: gpt-4-1106-preview
OpenAI Chat: gpt-4-0125-preview
OpenAI Chat: gpt-4-turbo-2024-04-09
OpenAI Chat: gpt-4-turbo (aliases: gpt-4-turbo-preview, 4-turbo, 4t)
OpenAI Chat: gpt-4.5-preview-2025-02-27
OpenAI Chat: gpt-4.5-preview (aliases: gpt-4.5)
OpenAI Chat: o1
OpenAI Chat: o1-2024-12-17
OpenAI Chat: o1-preview
OpenAI Chat: o1-mini
OpenAI Chat: o3-mini
OpenAI Chat: o3
OpenAI Chat: o4-mini
OpenAI Chat: gpt-5
OpenAI Chat: gpt-5-mini
OpenAI Chat: gpt-5-nano
OpenAI Chat: gpt-5-2025-08-07
OpenAI Chat: gpt-5-mini-2025-08-07
OpenAI Chat: gpt-5-nano-2025-08-07
OpenAI Completion: gpt-3.5-turbo-instruct (aliases: 3.5-instruct, chatgpt-instruct)
Perplexity: sonar-deep-research
Perplexity: sonar-reasoning-pro
Perplexity: sonar-reasoning
Perplexity: sonar-pro
Perplexity: sonar
Perplexity: r1-1776
Default: gpt-4o-mini

This list is current as of the date this blog post was written.
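The model list is long, so I filter it down to just the Perplexity entries, relying on the "Perplexity:" prefix visible in the output above. The guard keeps the snippet harmless on machines without llm installed:

```shell
# Show only the Perplexity-provided models.
if command -v llm >/dev/null 2>&1; then
  llm models | grep '^Perplexity:'
fi
```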

🚀 First Use: Asking LLM to Suggest a Shell Command

Okay, here is where things get fun.

I started with something simple: identify all files larger than 1GB. I tried this prompt:

llm -m sonar-pro cmd "find all files in this local directory that are larger than 1GB"

It responded with something like:

Multiline command - Meta-Enter or Esc Enter to execute
> find . -type f -size +1G -exec ls -lh {} \;

  ## Citations:
  [1] https://tecadmin.net/find-all-files-larger-than-1gb-size-in-linux/
  [2] https://chemicloud.com/kb/article/find-and-list-files-bigger-or-smaller-than-in-linux/
  [3] https://manage.accuwebhosting.com/knowledgebase/3647/How-to-Find-All-Files-Larger-than-1GB-in-Linux.html
  [4] https://hcsonline.com/support/resources/blog/find-files-larger-than-1gb-command-line

Aborted!

I did not want to execute this, so I interrupted the process.

💡 Tip: Always review AI-suggested commands before running them — especially if they involve find /, rm -rf, or anything destructive.
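That review-first habit can also be scripted. Here is a minimal sketch of my own (confirm_run is not part of llm-cmd): show a suggested command and require an explicit "y" before executing it; anything else aborts.

```shell
# Print a suggested command and only run it after explicit confirmation.
confirm_run() {
  cmd="$1"
  printf 'Suggested command:\n  %s\nRun it? [y/N] ' "$cmd"
  IFS= read -r answer
  case "$answer" in
    y|Y) eval "$cmd" ;;
    *)   echo "Aborted." ;;
  esac
}
```

For example, `confirm_run 'find . -type f -size +1G -exec ls -lh {} \;'` prints the command and waits for your answer.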


📂 Example: Running the command manually

If you decide to run it manually, you might do:

find . -xdev -type f -size +1G -exec ls -lh {} \;

My output was like:

-rw-r--r-- 1 ebal ebal 3.5G Jun  9 11:20 ./.cache/colima/caches/9efdd392c203dc39a21e37036e2405fbf5b0c3093c55f49c713ba829c2b1f5b5.raw
-rw-r--r-- 1 ebal ebal 13G Jun  9 11:58 ./.local/share/rancher-desktop/lima/0/diffdisk

A cool way to find big files, especially if your disk is filling up 💾.
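A variant I find handy (my own tweak, assuming GNU find, du, and sort): sort the matches largest-first and cap the output.

```shell
# Largest-first listing of files over 1GB on this filesystem; lower the
# -size threshold (e.g. +100M) if +1G turns up nothing.
find . -xdev -type f -size +1G -exec du -h {} + 2>/dev/null \
  | sort -rh \
  | head -20
```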


🤔 Things I Learned / Caveats

  • ⚠️ AI-suggested commands are helpful, but sometimes they assume things (permissions, paths) that I didn’t expect.

  • 🐍 Using a virtual env helps avoid version mismatches.

  • 🔄 The plugins sometimes need updates; keep track of version changes.

  • 🔑 Be careful with your API key — don’t commit it anywhere.


✅ Summary & What’s Next

So, after doing this:

  • 🛠️ Got llm working with Perplexity

  • 📜 Asked for shell commands

  • 👀 Reviewed + tested output manually

Next, I would like to run Ollama in my home lab. I don’t have a GPU yet, so I’ll have to settle for Docker on an old CPU, which means things will be slow and will require some patience. I also want to experiment with combining an LLM with tools like the Agno framework to set up a self-hosted agentic solution for everyday use.


That’s it!

PS. These are my personal notes from my home lab; AI was used to structure and format the final version of this blog post.


License GNU FDL 1.3 - CC BY-SA 3.0