GPT-OSS-120B is a powerful open-source alternative to GPT models and offers many modern capabilities such as tool calling, advanced context processing and flexible customization options. Thanks to our OpenAI-compatible API, our Managed AI model can be integrated directly into existing applications without any major conversions or special customizations.
As we operate GPT-OSS-120B on our own infrastructure in Germany, you have a GDPR-compliant, reliable and scalable environment at your disposal. This means you can use the model wherever GPT-3.5, GPT-4 or similar systems were previously used, with maximum flexibility and full functionality.
In this article, I’ll show you a few ways in which the model can be easily integrated.
Integration in development environments (IDE)
Many IDEs, such as the JetBrains IDEs or Visual Studio Code, already support OpenAI-compatible AI assistants.
Using plugins such as Roo Code or Cline, you can use our Managed AI model directly via its API with minimal configuration.
Integrating it is straightforward:
- Install plugin
- Configure your own API URL
- Insert API key
- Get started
Since these plugins support the OpenAI format, our Managed AI model works without additional customization: perfect for autocomplete, debugging, code refactoring and documentation.
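The same three settings (API URL, API key, model name) can also be checked outside the IDE. A minimal Python sketch that builds the chat-completions request such plugins send, using only the standard library; the API key is a placeholder, and actually sending the request is left commented out:

```python
import json
import urllib.request

API_BASE = "https://api.ai.nws.netways.de/v1"  # endpoint from this article
API_KEY = "API_KEY"  # placeholder: substitute your own key

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build the chat-completions request that OpenAI-compatible plugins send."""
    payload = {
        "model": "openai/gpt-oss-120b",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("Say hello.")
# To actually send it (requires a valid key):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

If this request succeeds from the command line, any plugin configured with the same URL, key and model name will work too.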
Workflow automation with n8n
The no-code tool n8n is ideal for integrating GPT-OSS into business applications. n8n natively supports OpenAI-compatible interfaces, so you can get started right away.
Typical areas of application:
- Automated e-mail summaries
- Classification of text and support requests
- Chatbot flows and customer dialogs
- Content generation for marketing teams
- And much more
With the OpenAI Node or a simple HTTP Request Node, our Managed AI model can be connected within a few minutes.
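If you use the HTTP Request Node instead of the OpenAI Node, the request body is plain chat-completions JSON. A sketch for the e-mail-summary case; the `{{ $json.emailBody }}` expression is n8n placeholder syntax and assumes the incoming item carries an `emailBody` field:

```json
{
  "model": "openai/gpt-oss-120b",
  "messages": [
    {
      "role": "system",
      "content": "Summarize the following e-mail in two sentences."
    },
    {
      "role": "user",
      "content": "{{ $json.emailBody }}"
    }
  ]
}
```

The node would POST this to https://api.ai.nws.netways.de/v1/chat/completions with an `Authorization: Bearer` header containing your API key.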
Integration in LangChain Agents & RAG workflows
For developers who want to build more complex systems, LangChain is the first choice.
Thanks to API compatibility, you can use GPT-OSS via existing LangChain classes, for example:
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="openai/gpt-oss-120b",
    openai_api_base="https://api.ai.nws.netways.de/v1",
    openai_api_key="API_KEY",
    temperature=0.8,
    streaming=False,
)

This allows intelligent agents, consistent RAG pipelines, multi-level automation and complex decision-making processes to be realized without proprietary models.
Usage with curl
Even a simple curl call is enough to send a request to the Managed AI Model and receive a response in the familiar chat completion format.
This allows you to test quickly, build initial integrations or integrate the model directly into existing scripts and workflows.
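The JSON that comes back follows the standard chat-completions schema, so extracting the answer takes a single lookup. A Python sketch with a canned, trimmed sample response (illustrative values, not real model output):

```python
import json

# A trimmed sample response in the standard chat-completions format
# (illustrative values, not real output).
sample = json.loads("""
{
  "model": "openai/gpt-oss-120b",
  "choices": [
    {"index": 0,
     "message": {"role": "assistant",
                 "content": "GPT-OSS-120B is an open-weight language model."},
     "finish_reason": "stop"}
  ]
}
""")

answer = sample["choices"][0]["message"]["content"]
print(answer)
```

The same `choices[0].message.content` path works on the response of the curl call below.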
curl https://api.ai.nws.netways.de/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer API_KEY" \
-d '{
"model": "openai/gp-oss-120b",
"messages": [
{
"role": "user",
"content": "Erkläre mir kurz, was GPT-OSS-120B ist."
}
]
}'

Use in OpenWebUI & other graphical frontends
For those who prefer to use AI via a graphical interface, OpenWebUI offers a modern and easy-to-understand solution. GPT-OSS can also be integrated there via API and is immediately ready for use. OpenWebUI also offers many practical functions:
- Web search: Answers can be expanded with up-to-date online information if required
- RAG support: integrate your own documents and receive context-based answers
- Create embeddings: for semantic search, knowledge systems or internal analysis functions
- Reranking: Sort search results or documents according to relevance
- Content extraction: Automatically extract content from PDFs, web pages or texts
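To illustrate what the embedding and reranking features do under the hood: semantic search ranks documents by vector similarity. A standard-library sketch with made-up three-dimensional vectors; real embeddings come from an embedding model and have hundreds of dimensions:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Made-up 3-dimensional embeddings; real ones come from an embedding model.
docs = {
    "hosting": [0.9, 0.1, 0.0],
    "billing": [0.1, 0.8, 0.2],
}
query = [0.8, 0.2, 0.1]

# Rerank documents by similarity to the query vector, best match first.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked)
```

OpenWebUI performs this kind of ranking for you when you enable RAG or reranking; the sketch only shows the principle.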
Perfect for teams with a less technical background who want to use AI day to day: you can quickly and intuitively equip your team with AI, without a local installation or complex infrastructure of your own.
Other compatible front ends:
- LibreChat
- Chatbot UI
- AnythingLLM
- MacAI
- LM Studio
- And many more
GDPR-compliant AI operation made in Germany
Our systems run on dedicated servers in ISO 27001-certified data centers in Germany.
This allows us to offer you full GDPR compliance, secure processing of sensitive data, transparent processes and, optionally, a customized infrastructure.
With our AI Individual solutions, we develop customized AI environments, from architecture and hosting to ongoing operations. Do you want to integrate AI into your processes, but don’t know exactly where to start?
We support you with:
- Integration with tools and workflows
- Setting up your own RAG or agent systems
- Hosting, scaling & monitoring
- Fine-tuning your own models
- Individual security or compliance requirements
Whether you need advice or help with implementation or operation, we are always at your disposal.