Vibe Coding #3 - Building a Python Library for an API | European Energy Data ENTSO-E

Building a proper Python library for the ENTSO-E Transparency Platform API. From raw API wrapper to a publishable PyPI package with clean interface, type hints, and documentation.

February 11, 2026 · 18:00 CET · 1:50:34
python · entsoe · api · open-source · energy · library

Result

A clean, typed Python library that turns the complex ENTSO-E XML API into pandas DataFrames — ready for analysis and published to PyPI.

from entsoe import Client

client = Client(api_key="your-token")
prices = client.prices.day_ahead(area="Spain", start="2026-02-01", end="2026-02-07")
# → DataFrame with hourly prices in EUR/MWh

The problem

  • The ENTSO-E REST API returns complex XML that requires manual parsing
  • Existing Python libraries bury the query type in long, flat function names instead of structured methods, so autocomplete does little to help you discover the right call
  • It took me several attempts to figure out which area codes and parameters each query needed
  • In Vibe Coding #2 we built MCP tools — now we package the underlying API client as a reusable library

ENTSO-E Transparency Platform showing France generation mix by technology

The solution

We started from the basic API client used in Vibe Coding #2 for the MCP tools. The goal: turn it into a publishable library with a clean, intuitive interface.

API design

We began by studying the eia-py library (a client for the US energy API) as a design reference:

Python EIA library in action — exploring routes, facets, and filters with dot notation

What we liked: autocomplete, logical grouping of methods, no magic strings. The goal: bring the same experience to ENTSO-E.

Agenda slide showing the target namespace pattern: client.prices, client.generation

Claude Code proposed three design options:

  1. Option A: Flat methods (get_day_ahead_prices(), get_generation_mix(), etc.)
  2. Option B: Data-type classes (Prices, Generation, Transmission)
  3. Option C: Typed namespace accessors (client.prices.day_ahead(), client.generation.mix())

Claude Code proposing design options — Option C (typed namespace accessors) selected

We selected Option C — namespaces with full type hints to leverage IDE autocomplete and IntelliSense method discovery.

Building the library

Claude Code generated the entire library structure in a single pass:

Claude Code creating the library structure — namespaces, mappings, and HTTP client

First working results

We ran the first real query — France’s generation mix:

client = Client(api_key=os.getenv("ENTSOE_API_KEY"))
generation = client.generation.mix(area="France", start="2026-02-01", end="2026-02-07")

First successful query — generation mix DataFrame from France in one line of code

The result: a clean DataFrame with generation types as columns and timestamps as the index — ready for analysis.

Plotly charts showing actual and forecast generation by type in France

Namespace architecture

The library organizes methods into five main namespaces:

client.load          # Electricity demand (actual + forecast)
client.prices        # Day-ahead, intraday, clearing prices
client.generation    # Mix, forecast, installed capacity
client.transmission  # Cross-border flows, capacity
client.balancing     # Balancing prices, reserves

Each namespace maps to a dedicated class that encapsulates the query and transformation logic.
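The pattern can be sketched in a few lines. Class and method names follow the namespaces listed above, but the internals (and the `documentType` value) are illustrative, not the library's actual implementation:

```python
# Minimal sketch of the namespace pattern: each namespace is a small class
# holding a reference back to the client, exposed as a typed attribute.
from __future__ import annotations


class PricesNamespace:
    """Groups price-related queries under client.prices."""

    def __init__(self, client: Client) -> None:
        self._client = client

    def day_ahead(self, area: str, start: str, end: str) -> dict:
        # A real implementation would issue the HTTP request and parse the
        # XML; here we just return the query parameters it would send.
        return {"documentType": "A44", "area": area, "start": start, "end": end}


class Client:
    def __init__(self, api_key: str) -> None:
        self.api_key = api_key
        # Plain attributes let IDEs autocomplete client.prices.day_ahead(...)
        self.prices = PricesNamespace(self)


client = Client(api_key="demo")
params = client.prices.day_ahead(area="ES", start="2026-02-01", end="2026-02-07")
```

Because each namespace is a concrete class with typed methods, type checkers and IDEs can list the available queries without any string lookups.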

Area code handling

The ENTSO-E API uses EIC (Energy Identification Code) area codes — for example, 10YFR-RTE------C for France. To avoid making users memorize them, we implemented a resolution system:

# Supports names, ISO codes, and EIC codes
client.prices.day_ahead(area="Spain")     # → 10YES-REE------0
client.prices.day_ahead(area="ES")        # → 10YES-REE------0
client.prices.day_ahead(area="10YES-REE------0")  # → pass-through

Area code mappings — abstracting ENTSO-E codes into simple country codes like ES, FR, DE

Over 60 areas supported — from entire countries to regional bidding zones.
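The resolution idea can be sketched as follows. The mapping contains only the two codes shown above (the real table covers 60+ areas), and the function name `resolve_area` is illustrative:

```python
# Accept a country name, an ISO-style code, or a raw EIC code and always
# return the EIC code the API expects.
_AREAS = {
    "spain": "10YES-REE------0",
    "es": "10YES-REE------0",
    "france": "10YFR-RTE------C",
    "fr": "10YFR-RTE------C",
}


def resolve_area(area: str) -> str:
    key = area.strip().lower()
    if key in _AREAS:
        return _AREAS[key]
    if len(area) == 16:  # EIC area codes are 16 characters: pass them through
        return area
    raise ValueError(f"Unknown area: {area!r}")
```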

XML parsing → DataFrame

The API returns XML market documents (GL_MarketDocument for load and generation data, Publication_MarketDocument for prices and flows). We implemented specialized parsers for each query type.

Each parser extracts time series from the XML and converts them into pandas DataFrames with typed columns and timestamps as the index.
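As an illustration of that step, here is a toy parser for a heavily simplified document. Real responses are namespaced and carry more metadata (resolution, units, series type), so treat this as a sketch of the shape, not the library's actual parser:

```python
from datetime import datetime, timedelta
from xml.etree import ElementTree

import pandas as pd

XML = """
<GL_MarketDocument>
  <TimeSeries>
    <Period>
      <timeInterval><start>2026-02-01T00:00Z</start></timeInterval>
      <resolution>PT60M</resolution>
      <Point><position>1</position><quantity>41250</quantity></Point>
      <Point><position>2</position><quantity>40110</quantity></Point>
    </Period>
  </TimeSeries>
</GL_MarketDocument>
"""


def parse_period(xml: str) -> pd.DataFrame:
    period = ElementTree.fromstring(xml).find(".//Period")
    start = datetime.strptime(period.findtext("timeInterval/start"), "%Y-%m-%dT%H:%MZ")
    step = timedelta(minutes=60)  # hard-coded PT60M; a real parser reads <resolution>
    # Each Point has a 1-based position; convert it to a timestamp offset.
    rows = {
        start + (int(p.findtext("position")) - 1) * step: float(p.findtext("quantity"))
        for p in period.findall("Point")
    }
    return pd.DataFrame({"quantity": rows})


df = parse_period(XML)
```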

Automatic query splitting

The ENTSO-E API has a one-year limit per query. For longer ranges, the library automatically splits into yearly queries and concatenates the results:

# 3-year query → automatically split into 3 requests
prices = client.prices.day_ahead(
    area="Germany",
    start="2023-01-01",
    end="2026-01-01"
)
# → DataFrame with 3 years of data
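The splitting logic can be sketched like this (the helper name `split_range` is illustrative): advance a cursor one calendar year at a time, clamp the last chunk to the requested end, and fetch each chunk separately.

```python
from datetime import date


def split_range(start: date, end: date) -> list[tuple[date, date]]:
    """Split [start, end) into chunks of at most one calendar year."""
    chunks = []
    cursor = start
    while cursor < end:
        try:
            next_cut = cursor.replace(year=cursor.year + 1)
        except ValueError:  # Feb 29 in a non-leap target year
            next_cut = cursor.replace(year=cursor.year + 1, day=28)
        chunk_end = min(next_cut, end)
        chunks.append((cursor, chunk_end))
        cursor = chunk_end
    return chunks


chunks = split_range(date(2023, 1, 1), date(2026, 1, 1))
# Three one-year chunks; each is fetched separately and the resulting
# DataFrames are concatenated with pd.concat.
```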

Retry logic and rate limiting

We implemented exponential backoff with jitter to handle network errors and rate limits:

import httpx
from tenacity import (
    retry,
    retry_if_exception_type,
    stop_after_attempt,
    wait_exponential,
    wait_random,
)

@retry(
    stop=stop_after_attempt(3),
    wait=wait_exponential(multiplier=1, min=4, max=10) + wait_random(0, 2),
    retry=retry_if_exception_type((httpx.HTTPStatusError, httpx.RequestError)),
)
def _make_request(self, params: dict) -> bytes:
    response = self.session.get(self.base_url, params=params)
    response.raise_for_status()
    return response.content

ZIP response handling

Some endpoints return ZIP-compressed responses. The library detects and decompresses automatically:

def _extract_xml(self, content: bytes) -> str:
    # Detect if response is ZIP
    if content.startswith(b'PK'):
        with zipfile.ZipFile(io.BytesIO(content)) as zf:
            xml_file = zf.namelist()[0]
            return zf.read(xml_file).decode('utf-8')
    return content.decode('utf-8')
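The signature sniffing is easy to verify in isolation: ZIP archives always begin with the two bytes `PK`, so the check is reliable. The standalone `extract_xml` below mirrors the method purely for demonstration:

```python
import io
import zipfile


def extract_xml(content: bytes) -> str:
    # Standalone variant of the library's _extract_xml, for demonstration.
    if content.startswith(b"PK"):
        with zipfile.ZipFile(io.BytesIO(content)) as zf:
            return zf.read(zf.namelist()[0]).decode("utf-8")
    return content.decode("utf-8")


# Round-trip: compress an XML document in memory, then extract it back.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("response.xml", "<GL_MarketDocument/>")

assert extract_xml(buf.getvalue()) == "<GL_MarketDocument/>"
assert extract_xml(b"<plain/>") == "<plain/>"
```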

Publishing to PyPI

We configured the project for publishing using uv:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "python-entsoe"
version = "0.1.0"
description = "Typed Python client for ENTSO-E Transparency Platform API"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "httpx>=0.27.0",
    "pandas>=2.2.0",
    "tenacity>=9.0.0",
]

We published the package live on stream:

uv build
uv publish

Published in under 20 seconds:

python-entsoe published on PyPI — live in under 20 seconds

The package page on PyPI shows the README with installation instructions, quick start examples, and the full namespace reference:

PyPI package page with README documentation and usage examples

To validate everything works end-to-end, we installed from a completely separate project — pip install python-entsoe, import, query, plot. Everything worked on the first try:

Using pip install python-entsoe from a separate project — data plots working immediately

CI/CD with GitHub Actions

We set up GitHub Actions to automate testing and publishing:

GitHub Actions CI/CD pipeline — automated publish to PyPI on every push

The workflow runs tests on every push and automatically publishes to PyPI when a version tag is created.
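Such a workflow might look roughly like the sketch below. The file name, action versions, and secret name are assumptions for illustration, not the project's actual configuration:

```yaml
# .github/workflows/ci.yml (illustrative sketch)
name: ci
on:
  push:
    branches: ["**"]
    tags: ["v*"]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v4
      - run: uv sync
      - run: uv run pytest

  publish:
    needs: test
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v4
      - run: uv build
      - run: uv publish
        env:
          UV_PUBLISH_TOKEN: ${{ secrets.PYPI_TOKEN }}
```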

Reference

Namespaces and key methods

| Namespace | Method | Parameters | Description |
| --- | --- | --- | --- |
| `client.load` | `actual()` | area, start, end | Actual electricity demand (MW) |
| | `forecast()` | area, start, end | Demand forecast (MW) |
| `client.prices` | `day_ahead()` | area, start, end | Day-ahead prices (EUR/MWh) |
| | `intraday()` | area, start, end | Intraday prices (EUR/MWh) |
| `client.generation` | `mix()` | area, start, end | Generation mix by source (MW) |
| | `forecast()` | area, start, end | Generation forecast (MW) |
| | `installed()` | area, start, end | Installed capacity (MW) |
| `client.transmission` | `flows()` | from_area, to_area, start, end | Cross-border flows (MW) |
| | `capacity()` | from_area, to_area, start, end | Interconnection capacity (MW) |
| `client.balancing` | `prices()` | area, start, end | Balancing prices (EUR/MWh) |

Project structure

src/entsoe/
├── __init__.py      # Public exports
├── client.py        # Main client with namespaces
├── _http.py         # HTTP layer (httpx + retries)
├── _xml.py          # XML → DataFrame parsers
├── _mappings.py     # EIC area codes
├── exceptions.py    # Custom exceptions
└── namespaces/
    ├── _base.py         # Base namespace class
    ├── load.py          # Load namespace
    ├── prices.py        # Prices namespace
    ├── generation.py    # Generation namespace
    ├── transmission.py  # Transmission namespace
    └── balancing.py     # Balancing namespace

Stack

  • Python 3.11+ / httpx / pandas / tenacity
  • uv (dependency management + build + publishing)
  • GitHub Actions (CI/CD)
  • PyPI (distribution)
