Welcome to the Nala Core API
Nala Core API is an opinionated, production-ready engine for building resilient, observable, and scalable microservices in Python. It provides a robust foundation—the Athomic Layer—that handles cross-cutting concerns, allowing developers to focus purely on business logic.
Our philosophy is grounded in battle-tested software engineering principles such as SOLID (whose "S" is the Single Responsibility Principle, SRP) and Dependency Injection (DI), ensuring that applications built on this engine are maintainable, testable, and easy to evolve.
Getting Started
Follow these steps to get a local development environment up and running.
Prerequisites
- Docker & Docker Compose
- Python 3.11+
- Poetry
1. Clone the Repository
git clone https://github.com/guandaline/athomic-docs.git
cd athomic-docs
2. Start Infrastructure Services
This command starts all the services defined in docker-compose.yml, such as MongoDB, Redis, and Vault.
docker-compose up -d
3. Install Dependencies
Install the project's Python dependencies using Poetry.
poetry install
4. Run the Application
Run the application with uvicorn. The server will start on http://127.0.0.1:8000.
poetry run uvicorn nala.api.main:app --reload
You can now access the API documentation at http://127.0.0.1:8000/docs.
Explore the Documentation
- Architecture Overview: Dive deep into the layered architecture and design principles.
- Dependency Injection: Understand how services are managed.
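As a hedged illustration of the Dependency Injection style the engine encourages, the sketch below shows constructor-based injection against an abstract interface. All class and method names here are hypothetical, not the engine's actual API:

```python
from typing import Protocol


class Repository(Protocol):
    """Abstract dependency: anything with a get(key) method satisfies it."""
    def get(self, key: str) -> str: ...


class InMemoryRepository:
    """Concrete implementation, handy for tests or local development."""
    def __init__(self) -> None:
        self._data = {"greeting": "hello"}

    def get(self, key: str) -> str:
        return self._data[key]


class GreetingService:
    """The service declares its dependency in the constructor instead of
    constructing it, so callers (or a DI container) choose the implementation."""
    def __init__(self, repo: Repository) -> None:
        self._repo = repo

    def greet(self) -> str:
        return self._repo.get("greeting").title()


service = GreetingService(InMemoryRepository())
print(service.greet())  # Hello
```

Because `GreetingService` only depends on the `Repository` protocol, swapping the in-memory store for a real database-backed one requires no change to the service itself.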
Athomic Layer Modules
Core & Lifecycle
- Services: The base service lifecycle.
- Lifecycle Management: How services are started and stopped in order.
- Plugins: The extensible plugin system.
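To make the ordered start/stop guarantee concrete, here is a minimal sketch of a service lifecycle manager. The names (`Service`, `LifecycleManager`) are illustrative assumptions, not the engine's real classes:

```python
class Service:
    """Hypothetical base service; subclasses override the start/stop hooks."""
    def __init__(self, name: str) -> None:
        self.name = name

    def start(self) -> None: ...
    def stop(self) -> None: ...


class LifecycleManager:
    """Starts services in registration order and stops them in reverse,
    so that dependents never outlive their dependencies."""
    def __init__(self) -> None:
        self._services: list[Service] = []

    def register(self, service: Service) -> None:
        self._services.append(service)

    def start_all(self) -> None:
        for service in self._services:
            service.start()

    def stop_all(self) -> None:
        for service in reversed(self._services):
            service.stop()


events: list[str] = []


class RecordingService(Service):
    def start(self) -> None:
        events.append(f"start:{self.name}")

    def stop(self) -> None:
        events.append(f"stop:{self.name}")


manager = LifecycleManager()
manager.register(RecordingService("database"))
manager.register(RecordingService("cache"))
manager.start_all()
manager.stop_all()
print(events)  # ['start:database', 'start:cache', 'stop:cache', 'stop:database']
```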
Configuration & Context
- Configuration Management: The Dynaconf- and Pydantic-based configuration system.
- Context Management: Handling request-scoped context for tracing and multi-tenancy.
Data & Persistence
- Connection Management: The central manager for all data store connections.
- Document Stores: Abstraction for document databases like MongoDB.
- Key-Value Stores: Abstraction for KV stores like Redis, with a powerful wrapper system.
- Transactional Outbox: For guaranteeing at-least-once event delivery.
- Database Migrations: Version-controlled schema management.
- File Storage: Abstraction for object storage like GCS or local files.
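The Transactional Outbox pattern mentioned above is worth a sketch: the business row and the event row are written in the same transaction, so the event exists if and only if the write committed; a separate relay later delivers unpublished events, which yields at-least-once semantics. This stdlib `sqlite3` version only illustrates the pattern; the table and column names are made up:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, total REAL)")
conn.execute(
    "CREATE TABLE outbox (id INTEGER PRIMARY KEY, payload TEXT, published INTEGER DEFAULT 0)"
)

# One atomic transaction: the event is recorded iff the business write commits.
with conn:
    conn.execute("INSERT INTO orders VALUES (?, ?)", ("o-1", 42.0))
    conn.execute(
        "INSERT INTO outbox (payload) VALUES (?)",
        (json.dumps({"event": "order_created", "order_id": "o-1"}),),
    )

# Relay step: fetch unpublished events and mark them published. Delivery to a
# broker would happen here; a crash between delivery and the UPDATE causes a
# redelivery on restart, hence "at-least-once".
rows = conn.execute("SELECT id, payload FROM outbox WHERE published = 0").fetchall()
for event_id, payload in rows:
    conn.execute("UPDATE outbox SET published = 1 WHERE id = ?", (event_id,))
conn.commit()

print(len(rows))  # 1
```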
Artificial Intelligence
- Agents & RAG: Built-in support for building AI agents and Retrieval-Augmented Generation systems.
- LLM Abstraction: Agnostic interfaces for various LLM providers to avoid vendor lock-in.
- Vector Stores: Integration with vector databases (e.g., Qdrant) for semantic search.
- Document Ingestion: Pipelines for ingesting and processing documents for AI consumption.
- Memory & Governance: State management for AI conversations and safety policies.
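To show the retrieval half of RAG in miniature: chunks are embedded, indexed, and ranked by similarity to the query, and the top chunks are then stuffed into the LLM prompt. A real deployment would use an embedding model and a vector database such as Qdrant; this sketch substitutes a toy bag-of-words "embedding":

```python
import math


def embed(text: str) -> dict[str, float]:
    """Toy bag-of-words 'embedding' (a real system calls an embedding model)."""
    counts: dict[str, float] = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0.0) + 1.0
    return counts


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


documents = [
    "redis is a key value store",
    "mongodb is a document database",
    "qdrant is a vector database for semantic search",
]
index = [(doc, embed(doc)) for doc in documents]


def retrieve(query: str, k: int = 1) -> list[str]:
    """Retrieval step of RAG: rank stored chunks by similarity to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]


print(retrieve("vector semantic search"))
# ['qdrant is a vector database for semantic search']
```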
Security
- Secrets Management: Securely resolve secrets at runtime from backends like Vault.
- Authentication & Authorization: Policy-based security for endpoints using JWT or API Keys.
- Cryptography: High-level abstraction for symmetric encryption.
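The runtime secret-resolution idea can be sketched as follows: configuration keeps a reference rather than the secret itself, and the reference is resolved against a backend only when needed. The `vault://` scheme and the backend class here are illustrative assumptions, not the engine's actual syntax:

```python
SCHEME = "vault://"


class FakeVault:
    """Stand-in for a real Vault client."""
    def __init__(self, data: dict[str, str]) -> None:
        self._data = data

    def read(self, path: str) -> str:
        return self._data[path]


def resolve(value: str, backend: FakeVault) -> str:
    """Pass plain values through; fetch vault:// references from the backend,
    so secrets never live in config files or environment dumps."""
    if value.startswith(SCHEME):
        return backend.read(value[len(SCHEME):])
    return value


vault = FakeVault({"secret/db/password": "s3cr3t"})
print(resolve("vault://secret/db/password", vault))  # s3cr3t
print(resolve("plain-value", vault))                 # plain-value
```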
Observability
- Structured Logging: Loguru-based logging with automatic sensitive data masking.
- Distributed Tracing: End-to-end tracing with OpenTelemetry.
- Metrics: Out-of-the-box instrumentation with Prometheus.
- Health & Readiness: Extensible readiness probes for Kubernetes.
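A minimal sketch of the extensible readiness-probe idea, assuming a registry of named check functions (the function names are hypothetical): each probe reports whether its dependency is reachable, and Kubernetes marks the pod Ready only when all of them pass.

```python
from typing import Callable

ProbeFn = Callable[[], bool]

# Registry of named readiness checks; modules register their own probes.
probes: dict[str, ProbeFn] = {}


def register_probe(name: str, fn: ProbeFn) -> None:
    probes[name] = fn


def readiness() -> tuple[bool, dict[str, bool]]:
    """Overall readiness is the AND of every registered probe."""
    results = {name: fn() for name, fn in probes.items()}
    return all(results.values()), results


register_probe("mongodb", lambda: True)   # pretend the ping succeeded
register_probe("redis", lambda: False)    # pretend Redis is down

ready, detail = readiness()
print(ready)   # False
print(detail)  # {'mongodb': True, 'redis': False}
```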
Cross-Cutting Concerns
- Serializer: Pluggable serializers (JSON, Protobuf) for data conversion.
- Payload Processing Pipeline: A "Pipes and Filters" system for composing transformations like encryption and compression.
- Notifications: Resiliently send emails via SMTP or other providers.
- Internal Event Bus: In-process Pub/Sub for decoupling internal components.
- Resilient HTTP Client: A factory for creating pre-configured, resilient clients.
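The "Pipes and Filters" idea behind the Payload Processing Pipeline can be sketched in a few lines: each filter is a bytes-to-bytes step, and a pipeline composes matched encode/decode chains. This version uses compression and base64 armoring as its filters; real pipelines would include encryption as well, and none of these names are the engine's actual API:

```python
import base64
import zlib
from typing import Callable

Filter = Callable[[bytes], bytes]


def run(payload: bytes, filters: list[Filter]) -> bytes:
    """Push the payload through each filter in order."""
    for f in filters:
        payload = f(payload)
    return payload


# Matched chains: decoding applies the inverse filters in reverse order.
encode: list[Filter] = [zlib.compress, base64.b64encode]
decode: list[Filter] = [base64.b64decode, zlib.decompress]

wire = run(b"hello " * 100, encode)       # compressed, then base64-armored
assert run(wire, decode) == b"hello " * 100
print(len(wire) < 600)  # True: compression shrank the repetitive payload
```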