rust explorer w/ textual + pydantic-ai
given that:
- I am learning rust and pyo3
- pydantic-ai makes it trivial to stream LLMs to a chat interface (e.g. textual)
- uv makes anything on PyPI accessible to anyone in a snap

I decided to write and publish a (probably unnecessary but fun) TUI that lets you explore a rust project's functions with your favorite streaming-capable LLM.
If you have uv and an OPENAI_API_KEY, the following should (hopefully) just work ™️:
```bash
cd path/to/my/rust/project

# try it with openai:gpt-4o (default)
uvx rspyai@latest

# try it with an OSS model via ollama
RSPYAI_AI_MODEL=ollama:qwen2.5 uvx rspyai@latest
```
... let me know if it doesn't!
but how does it work?
First, we use the syn crate to define utils for parsing public rust functions from a path:
```rust
// In Rust, we parse the code and extract public functions from the user's path
#[pyfunction]
fn scan_rust_project(py: Python<'_>, file_path: Option<&str>) -> PyResult<Py<PyList>> {
    let functions = PyList::empty(py);
    let root_path = match file_path {
        Some(p) => Path::new(p),
        None => Path::new("src"),
    };

    // Parse and collect public functions using syn
    if let Ok(rust_functions) = ProjectScanner::scan_directory(root_path) {
        for func in rust_functions {
            let dict = PyDict::new(py);
            dict.set_item("name", func.name)?;
            dict.set_item("doc", func.doc)?;
            dict.set_item("signature", func.signature)?;
            // ... more metadata
            functions.append(dict)?;
        }
    }

    Ok(functions.into())
}
```
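ProjectScanner is the project's own helper and is elided above. As a rough, illustrative sketch (assuming syn's `full` feature, with made-up names that aren't from the repo), collecting the top-level `pub fn`s of a single file could look something like this:

```rust
// Illustrative only: collect the names of top-level `pub fn` items in one file.
// (The real ProjectScanner walks a whole directory and records richer metadata.)
use syn::{Item, Visibility};

fn public_functions(source: &str) -> syn::Result<Vec<String>> {
    let file = syn::parse_file(source)?;
    Ok(file
        .items
        .iter()
        .filter_map(|item| match item {
            // Keep only free functions declared `pub`
            Item::Fn(f) if matches!(f.vis, Visibility::Public(_)) => {
                Some(f.sig.ident.to_string())
            }
            _ => None,
        })
        .collect())
}
```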
these rust functions are exposed as a pyo3/extension-module, which can be built and published as python wheels for a bunch of different architectures, so the end-user of the TUI can just pip/uv install and run rspyai (even if they do not have rust installed! 🪄):
```rust
#[pymodule]
fn rspyai(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(scan_rust_project, m)?)?;
    m.add_function(wrap_pyfunction!(get_function_metadata, m)?)?;
    Ok(())
}
```
I built some textual widgets leveraging this exported module's utils to discover the user's tree of rust functions and display / summarize details about them:
```python
class FunctionBrowser(App[None]):
    """A TUI for browsing Rust functions."""

    def compose(self) -> ComposeResult:
        yield Header()
        with Horizontal():
            tree = FunctionTree()  # shows function hierarchy
            tree.scan_project(self.root_path)  # calls our Rust scanner inside
            yield tree
            yield FunctionDetails()  # shows function details & AI summary
        yield Footer()
```
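FunctionTree itself isn't shown here; conceptually it just feeds the scanner's output into a textual Tree widget. A rough sketch (not the actual widget from the repo) might be:

```python
# Rough sketch (not the real widget): a textual Tree fed by the Rust scanner.
from textual.widgets import Tree

from rspyai import scan_rust_project


class FunctionTree(Tree[dict]):
    def __init__(self) -> None:
        super().__init__("functions")

    def scan_project(self, root_path: str) -> None:
        # Ask the pyo3 extension for the project's public functions,
        # then add one leaf per function, keeping its metadata on the node.
        for func in scan_rust_project(root_path):
            self.root.add_leaf(func["name"], data=func)
        self.root.expand()
```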
under the hood of the FunctionDetails widget, I use pydantic-ai to stream AI summaries (you can use any LLM provider they support) of the functions' source code into our TUI:
```python
class Settings(BaseSettings):
    ai_model: KnownModelName = Field(
        default='openai:gpt-4o',
        description='Model to use for function analysis',
    )


class FunctionSummaryWidget(Widget):
    async def generate_summary(self, signature: str, docs: str, source: str) -> None:
        settings = get_settings()
        agent = Agent(
            settings.ai_model,
            system_prompt=settings.ai_system_prompt,
        )
        async with agent.run_stream(f"Analyze:\n{source}") as result:
            async for message in result.stream():
                self.update_summary(message)
```
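The RSPYAI_AI_MODEL environment variable from the usage example above presumably maps onto this settings object via an env prefix; with pydantic-settings that looks roughly like this (assumed configuration, not copied from the repo):

```python
# Sketch: mapping RSPYAI_AI_MODEL onto Settings.ai_model via pydantic-settings.
from pydantic import Field
from pydantic_ai.models import KnownModelName
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # env_prefix means RSPYAI_AI_MODEL populates `ai_model`
    model_config = SettingsConfigDict(env_prefix='RSPYAI_')

    ai_model: KnownModelName = Field(
        default='openai:gpt-4o',
        description='Model to use for function analysis',
    )
```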
this isn't trying to be a particularly practical tool - it's more about exploring how these pieces can work together (and an exercise in streaming AI responses!)
Plus, I learned a bit of Rust / pyo3 along the way, which was the original goal anyway. Maybe I'll add the ability to view arbitrary symbols at some point (if I can manage that).
for more, or to contribute / tinker, see the repo!