# structuinput

A lightweight Python package that converts unstructured user inputs (natural‑language descriptions, queries, or specifications) into structured, machine‑readable outputs such as API request templates, configuration files, or data schemas. It leverages `llmatch-messages` together with a default LLM (ChatLLM7) to guarantee that the generated text matches a predefined regular‑expression pattern, making the result ready for direct integration.
- Installation
- Quick Start
- Function Signature
- Using the Default LLM (ChatLLM7)
- Providing Your Own LLM
- Environment Variables & API Keys
- Rate Limits & Quotas
- Troubleshooting & Errors
- Contributing & Issues
- License
- Author
## Installation

```bash
pip install structuinput
```

## Quick Start

```python
from structuinput import structuinput
# Example unstructured description
user_input = """
I need an endpoint to upload a user avatar.
It should accept a multipart/form‑data body with a field called `image`,
return JSON with `url` and `size`, and require an `Authorization` header.
"""
# Use the default ChatLLM7 (API key taken from env or fallback)
responses = structuinput(user_input)
print(responses)  # → List of strings that match the defined output pattern
```

The function returns a list of strings that already conform to the regular‑expression pattern defined in `structuinput.prompts.pattern`.
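If you want to inspect the pattern that the outputs are validated against, it can be imported from the module mentioned above. A minimal sketch (the query string is only illustrative, and the import assumes `pattern` is a module-level attribute of `structuinput.prompts`):

```python
from structuinput import structuinput
from structuinput.prompts import pattern  # regex the raw LLM output is matched against

responses = structuinput("Describe a REST endpoint for listing invoices.")
print(pattern)       # the enforced pattern
for item in responses:
    print(item)      # each entry was extracted according to that pattern
```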

## Function Signature

```python
def structuinput(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None,
) -> List[str]:
    """
    Convert free‑form text into a structured output.

    Parameters
    ----------
    user_input: str
        The free‑form description or query to be transformed.
    api_key: Optional[str]
        API key for the LLM7 service. If omitted, the function reads the
        `LLM7_API_KEY` environment variable. When both are missing, a placeholder
        key `"None"` is used (suitable only for testing).
    llm: Optional[BaseChatModel]
        A LangChain chat model instance. If omitted, the package creates a
        `ChatLLM7` instance internally.

    Returns
    -------
    List[str]
        A list of extracted strings that satisfy the output pattern.
    """
```

## Using the Default LLM (ChatLLM7)

structuinput ships with `ChatLLM7` (from the `langchain_llm7` package) as the built‑in language model.

```python
from structuinput import structuinput
response = structuinput(
user_input="Create a JSON config for a Redis cache with host, port, and db index."
)
print(response)
```

If an API key is required, set it in your environment:

```bash
export LLM7_API_KEY="your_llm7_api_key"
```

or pass it directly:

```python
response = structuinput(user_input, api_key="your_llm7_api_key")
```

You can obtain a free key by registering at https://token.llm7.io/. The free tier’s rate limits are sufficient for most development and prototyping scenarios.

## Providing Your Own LLM

You may replace the default model with any LangChain‑compatible chat model, e.g., OpenAI, Anthropic, or Google Gemini.

```python
from langchain_openai import ChatOpenAI
from structuinput import structuinput
llm = ChatOpenAI(model="gpt-4o-mini")
response = structuinput("Describe a PostgreSQL connection string.", llm=llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from structuinput import structuinput
llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = structuinput("Generate a Terraform module for an S3 bucket.", llm=llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from structuinput import structuinput
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = structuinput("Write a Kubernetes Deployment YAML for a Node.js app.", llm=llm)
print(response)
```

All custom LLMs must implement the `BaseChatModel` interface from LangChain.
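For reference, here is a minimal sketch of what that interface requires. `EchoChatModel` below is purely hypothetical (it just echoes the last message, so it will not produce pattern‑conforming output); the method names come from LangChain's `BaseChatModel` contract:

```python
from typing import Any, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.messages import AIMessage, BaseMessage
from langchain_core.outputs import ChatGeneration, ChatResult


class EchoChatModel(BaseChatModel):
    """Hypothetical toy model: echoes the last message back as the reply."""

    @property
    def _llm_type(self) -> str:
        return "echo-chat-model"

    def _generate(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        # A real implementation would call your model's API here.
        reply = AIMessage(content=messages[-1].content)
        return ChatResult(generations=[ChatGeneration(message=reply)])
```

In practice you would replace the body of `_generate` with a real API call and pass an instance via `structuinput(user_input, llm=MyModel())`.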

## Environment Variables & API Keys

| Variable | Description |
|---|---|
| `LLM7_API_KEY` | API key for the default ChatLLM7 service. |
| `LLM7_BASE_URL` | (Optional) Override the base URL for the LLM7 service. |
If you provide `api_key` directly to `structuinput`, it takes precedence over the environment variable.
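Both variables can also be set programmatically before the first call, for example in test setup. A minimal sketch; whether the base‑URL override is picked up depends on how the bundled ChatLLM7 client reads its configuration, so treat that part as an assumption:

```python
import os

from structuinput import structuinput

os.environ["LLM7_API_KEY"] = "your_llm7_api_key"
# Assumed override; only takes effect if ChatLLM7 reads LLM7_BASE_URL.
os.environ["LLM7_BASE_URL"] = "https://your-llm7-endpoint.example"

response = structuinput("Draft a YAML config for an nginx reverse proxy.")
print(response)
```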

## Rate Limits & Quotas

- ChatLLM7 free tier: generous daily limits suitable for typical development workloads.
- For higher throughput, obtain a paid plan from the LLM7 provider or switch to another LLM (OpenAI, Anthropic, etc.) that matches your quota requirements; a simple client‑side throttling sketch follows below.
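A minimal throttling sketch for batch workloads; the one‑second delay is an arbitrary placeholder, not an official quota figure:

```python
import time

from structuinput import structuinput

descriptions = [
    "An endpoint to list orders with pagination.",
    "A config block for a Celery worker with a Redis broker.",
]

results = []
for text in descriptions:
    results.append(structuinput(text))
    time.sleep(1)  # crude client-side pacing; tune to your plan's limits
print(results)
```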

## Troubleshooting & Errors

structuinput uses `llmatch` to enforce that the LLM output matches a regular expression. If the pattern is not satisfied, a `RuntimeError` is raised:
```
RuntimeError: LLMS call failed
```

Typical reasons:
- Invalid API key – double‑check `LLM7_API_KEY` or the key passed to the function.
- Network issues – ensure your environment can reach the LLM endpoint.
- Prompt/Pattern mismatch – adjust your input so the model can generate text aligned with the expected format.
Enable verbose mode (set `verbose=True` inside the source) for more detailed logs.
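In code, this failure mode can be caught and retried. A minimal sketch, assuming you simply want a few attempts before giving up (`structure_with_retry` is a hypothetical helper, not part of the package):

```python
from structuinput import structuinput


def structure_with_retry(text, attempts=3):
    """Retry the call a few times; llmatch failures surface as RuntimeError."""
    for attempt in range(1, attempts + 1):
        try:
            return structuinput(text)
        except RuntimeError as exc:  # raised when the output never matches the pattern
            if attempt == attempts:
                raise
            print(f"Attempt {attempt} failed ({exc}); retrying...")


print(structure_with_retry("Define a webhook payload for payment events."))
```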

## Contributing & Issues

Feel free to open bug reports, feature requests, or pull requests:
- GitHub Issues: https://github.com/chigwell/structuinput/issues
Please follow the usual contribution guidelines (tests, documentation, style) when submitting PRs.

## License

Distributed under the MIT License. See the LICENSE file for details.

## Author

Eugene Evstafev
✉️ Email: hi@euegne.plus
🐙 GitHub: https://github.com/chigwell
Happy structuring! 🚀