AI Cloud (PaaS)#

Deep Thought took 7.5 million years to calculate "42." Your AI models produce answers in seconds — for any question you throw at them.

VeriTeknik AI Cloud provides access to large language models (LLMs) through a standard OpenAI-compatible API. You don't need to change your code to switch models; just change your subscription.

Supported Models#

The AI Cloud panel lists available models and their pricing. Models are updated regularly. General categories:

| Category | Examples |
| --- | --- |
| Large Language Models | Llama, Mistral, Gemma, Qwen |
| Code Assistants | DeepSeek Coder, CodeLlama |
| Multimodal Models | Image understanding + text generation |

Model catalog

For the current model list and token pricing, see AI Cloud > Models in your panel. Prices are shown per token in USD.
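The catalog can also be read programmatically: OpenAI-compatible servers typically expose the model list at /v1/models, reachable through the SDK's client.models.list(). A minimal sketch, assuming VeriTeknik exposes that endpoint (confirm in your panel); the grouping helper is purely illustrative:

```python
def list_model_ids(api_key: str, base_url: str = "https://api.veriteknik.com/v1"):
    """Fetch model ids from the (assumed) OpenAI-compatible /v1/models endpoint."""
    from openai import OpenAI  # local import so the helper below works without the SDK
    client = OpenAI(api_key=api_key, base_url=base_url)
    return sorted(m.id for m in client.models.list())

def by_vendor(model_ids):
    """Group ids like "meta-llama/Llama-3.2-3B-Instruct" by their vendor prefix."""
    groups = {}
    for mid in model_ids:
        vendor = mid.split("/", 1)[0] if "/" in mid else "other"
        groups.setdefault(vendor, []).append(mid)
    return groups
```

For example, by_vendor(list_model_ids("vt-...")) returns one list of ids per model family, which is handy when comparing prices within a category.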

API Access#

When your PaaS subscription is active, you're assigned an API key. Use it to connect to VeriTeknik AI Cloud from any OpenAI SDK-compatible application or library.

API Endpoint:

https://api.veriteknik.com/v1

Example (Python openai library):

from openai import OpenAI

client = OpenAI(
    api_key="vt-...",
    base_url="https://api.veriteknik.com/v1"
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

OpenAI compatible

Set the base_url parameter to the VeriTeknik endpoint and your existing OpenAI code works without modification. Like hitchhiking across the galaxy: grab your towel and go.
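Streaming goes through the same client. A minimal sketch, assuming the standard stream=True flag is honoured for your model (confirm in your panel; "vt-..." is a placeholder key):

```python
def assemble(deltas) -> str:
    """Join streamed content deltas into the full reply (keep-alive chunks may be None)."""
    return "".join(d for d in deltas if d)

def stream_reply(prompt: str, api_key: str = "vt-...") -> str:
    """Stream a chat completion from the VeriTeknik endpoint and return the full text."""
    from openai import OpenAI  # local import keeps the sketch loadable without the SDK
    client = OpenAI(api_key=api_key, base_url="https://api.veriteknik.com/v1")
    stream = client.chat.completions.create(
        model="meta-llama/Llama-3.2-3B-Instruct",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # chunks arrive as the model generates them
    )
    return assemble(chunk.choices[0].delta.content for chunk in stream)
```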

Usage and Billing#

AI Cloud usage is billed per token. Input and output token prices are set separately for each model.

  • Usage summaries are available in the Billing section
  • Monthly usage breakdowns are sent by email
  • Add credits at Billing > Add Balance before your balance runs low
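Responses from OpenAI-compatible APIs carry a usage field with prompt and completion token counts, so you can estimate the cost of each request yourself. A minimal sketch with hypothetical per-token prices; real USD prices come from AI Cloud > Models:

```python
# Hypothetical per-token USD prices; take real values from AI Cloud > Models.
PRICES = {
    "meta-llama/Llama-3.2-3B-Instruct": {"input": 0.00000002, "output": 0.00000004},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the USD cost of one request from the response's usage counts."""
    p = PRICES[model]
    return prompt_tokens * p["input"] + completion_tokens * p["output"]

# After a call: usage = response.usage
cost = request_cost("meta-llama/Llama-3.2-3B-Instruct", 120, 350)
```

Summing these estimates per request gives a running total you can compare against the usage summaries in the Billing section.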

Monitor your balance

If your balance runs out, API requests are rejected. You receive advance warnings rather than sudden cutoffs — we pay attention to details that the Vogons don't.
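When handling rejected requests it helps to separate transient failures, which are worth retrying, from a hard out-of-credits rejection, which is not. A generic sketch: PermissionError stands in for the billing rejection, since the actual status code VeriTeknik returns is not specified here:

```python
import time

def call_with_retry(fn, attempts: int = 3, base_delay: float = 1.0):
    """Retry a request-like callable with exponential backoff.

    Transient errors are retried; PermissionError (a stand-in for an
    out-of-credits rejection) is re-raised immediately, because retrying
    won't help until you add balance.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except PermissionError:
            raise  # out of credits: top up at Billing > Add Balance instead
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

In practice you would map your SDK's exception for an HTTP 4xx billing error onto the "don't retry" branch.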

Marketplace Templates#

For ready-to-use scenarios, select an application template from the Marketplace section. Morpheus AI analyzes your chosen template and recommends configuration parameters:

  1. Go to AI Cloud > Marketplace
  2. Select a template (e.g. RAG pipeline, chatbot, summarization service)
  3. Morpheus explains the recommended model and parameters
  4. Confirm — the configuration is applied to your account

Support#

For technical questions, open a ticket from the Support section. We can help with API integration.