
structuinput


structuinput – a lightweight Python package that converts unstructured user inputs (natural‑language descriptions, queries, or specifications) into structured, machine‑readable outputs such as API request templates, configuration files, or data schemas. It leverages llmatch‑messages together with a default LLM (ChatLLM7) to guarantee that the generated text matches a predefined regular‑expression pattern, making the result ready for direct integration.


Table of Contents

  • Installation
  • Quick Start
  • Function Signature
  • Using the Default LLM (ChatLLM7)
  • Providing Your Own LLM
  • Environment Variables & API Keys
  • Rate Limits & Quotas
  • Troubleshooting & Errors
  • Contributing & Issues
  • License
  • Author

Installation

pip install structuinput

Quick Start

from structuinput import structuinput

# Example unstructured description
user_input = """
I need an endpoint to upload a user avatar. 
It should accept a multipart/form‑data body with a field called `image`,
return JSON with `url` and `size`, and require an `Authorization` header.
"""

# Use the default ChatLLM7 (API key taken from env or fallback)
responses = structuinput(user_input)

print(responses)   # → List of strings that match the defined output pattern

The function returns a list of strings that already conform to the regular‑expression pattern defined in structuinput.prompts.pattern.
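
If you want to validate the output yourself, here is a minimal sketch; it assumes structuinput.prompts.pattern is importable as a plain regular-expression string (the actual package layout may differ):

import re

from structuinput import structuinput
from structuinput.prompts import pattern  # assumed to be a regex string, per the note above

responses = structuinput("Create a JSON schema for a product with name, price, and SKU.")

compiled = re.compile(pattern, re.DOTALL)
for item in responses:
    # Each returned string should already satisfy the package's output pattern.
    assert compiled.search(item) is not None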


Function Signature

def structuinput(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None,
) -> List[str]:
    """
    Convert free‑form text into a structured output.

    Parameters
    ----------
    user_input: str
        The free‑form description or query to be transformed.
    api_key: Optional[str]
        API key for the LLM7 service. If omitted, the function reads the
        `LLM7_API_KEY` environment variable. When both are missing, a placeholder
        key `"None"` is used (suitable only for testing).
    llm: Optional[BaseChatModel]
        A LangChain chat model instance. If omitted, the package creates a
        `ChatLLM7` instance internally.

    Returns
    -------
    List[str]
        A list of extracted strings that satisfy the output pattern.
    """

Using the Default LLM (ChatLLM7)

structuinput ships with ChatLLM7 (from the langchain_llm7 package) as the built‑in language model.

from structuinput import structuinput

response = structuinput(
    user_input="Create a JSON config for a Redis cache with host, port, and db index."
)
print(response)

If an API key is required, set it in your environment:

export LLM7_API_KEY="your_llm7_api_key"

or pass it directly:

response = structuinput(user_input, api_key="your_llm7_api_key")

You can obtain a free key by registering at https://token.llm7.io/. The free tier’s rate limits are sufficient for most development and prototyping scenarios.


Providing Your Own LLM

You may replace the default model with any LangChain‑compatible chat model, e.g., OpenAI, Anthropic, or Google Gemini.

OpenAI

from langchain_openai import ChatOpenAI
from structuinput import structuinput

llm = ChatOpenAI(model="gpt-4o-mini")
response = structuinput("Describe a PostgreSQL connection string.", llm=llm)
print(response)

Anthropic

from langchain_anthropic import ChatAnthropic
from structuinput import structuinput

llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = structuinput("Generate a Terraform module for an S3 bucket.", llm=llm)
print(response)

Google Generative AI

from langchain_google_genai import ChatGoogleGenerativeAI
from structuinput import structuinput

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = structuinput("Write a Kubernetes Deployment YAML for a Node.js app.", llm=llm)
print(response)

All custom LLMs must implement the BaseChatModel interface from LangChain.
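
As a rough illustration (not taken from the package source), any BaseChatModel subclass can be forwarded unchanged; the import path below assumes a recent langchain-core layout:

from typing import List

from langchain_core.language_models.chat_models import BaseChatModel
from structuinput import structuinput

def structure_with(llm: BaseChatModel, text: str) -> List[str]:
    # Any LangChain chat model that subclasses BaseChatModel works as the `llm` argument.
    return structuinput(text, llm=llm)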


Environment Variables & API Keys

Variable         Description
LLM7_API_KEY     API key for the default ChatLLM7 service.
LLM7_BASE_URL    (Optional) Override the base URL for the LLM7 service.

If you provide api_key directly to structuinput, it takes precedence over the environment variable.
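
The resolution order described above behaves roughly like the following sketch (an illustration based on the docstring, not the package source):

import os
from typing import Optional

def resolve_api_key(api_key: Optional[str] = None) -> str:
    # Explicit argument wins, then the LLM7_API_KEY environment variable,
    # and finally the placeholder "None" that is only suitable for testing.
    return api_key or os.environ.get("LLM7_API_KEY") or "None"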


Rate Limits & Quotas

  • ChatLLM7 free tier: generous daily limits suitable for typical development workloads.
  • For higher throughput, obtain a paid plan from the LLM7 provider or switch to another LLM (OpenAI, Anthropic, etc.) that matches your quota requirements.

Troubleshooting & Errors

structuinput uses llmatch to enforce that the LLM output matches a regular expression. If the pattern is not satisfied, a RuntimeError is raised:

RuntimeError: LLMS call failed

Typical reasons:

  1. Invalid API key – double‑check LLM7_API_KEY or the key passed to the function.
  2. Network issues – ensure your environment can reach the LLM endpoint.
  3. Prompt/Pattern mismatch – adjust your input so the model can generate text aligned with the expected format.

Enable verbose mode (set verbose=True inside the source) for more detailed logs.
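
In practice you may want to catch this error explicitly; a minimal sketch:

from structuinput import structuinput

try:
    responses = structuinput("Create a YAML config for an nginx reverse proxy.")
except RuntimeError as exc:
    # Raised when the LLM output never matches the expected pattern,
    # e.g. due to a bad API key, network problems, or an unsatisfiable prompt.
    print(f"structuinput failed: {exc}")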


Contributing & Issues

Feel free to open bug reports, feature requests, or pull requests on the project's GitHub repository (https://github.com/chigwell/structuinput).

Please follow the usual contribution guidelines (tests, documentation, style) when submitting PRs.


License

Distributed under the MIT License. See the LICENSE file for details.


Author

Eugene Evstafev
✉️ Email: hi@eugene.plus
🐙 GitHub: https://github.com/chigwell


Happy structuring! 🚀