This directory contains the backend services for the Blockly TFLite conversion feature.
- Python 3.8 or higher
- pip (Python package manager)
OR
- Docker and Docker Compose
- Copy the environment file and configure:

```bash
cp .env.example .env
# Edit .env with your configuration
```

- Build and run with Docker Compose:

```bash
docker-compose up -d
```

The service will be available at http://localhost:3000

To stop the service:

```bash
docker-compose down
```

To view logs:

```bash
docker-compose logs -f
```

- Create a virtual environment (recommended):

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Configure environment variables (create .env file):

```bash
cp .env.example .env
# Edit .env with your configuration
```

- Run the application:

```bash
python app.py
```

Located in `services/tflite_converter.py`, this service provides functions to:
- **Convert TensorFlow.js models to TFLite format** (`convert_tfjs_to_tflite`)
  - Accepts a TensorFlow.js model directory path
  - Converts to TensorFlow SavedModel (intermediate format)
  - Applies the TFLite converter with optional quantization
  - Returns the path to the generated `.tflite` file

- **Generate C/C++ byte arrays** (`tflite_to_c_array`)
  - Reads a TFLite binary file
  - Generates formatted C/C++ code with a `const unsigned char` array
  - Includes metadata comments and size constants
  - Returns a formatted code string ready for microcontroller deployment

- **Extract model metadata** (`get_model_info`)
  - Reads the TFLite model and extracts input/output shapes
  - Returns a dictionary with model information
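To illustrate the byte-array step, here is a minimal, self-contained sketch of how a TFLite binary can be rendered as a `const unsigned char` array. The helper name `bytes_to_c_array` and the exact output formatting are assumptions for illustration, not the actual implementation in `services/tflite_converter.py`:

```python
def bytes_to_c_array(data: bytes, array_name: str = "model_data") -> str:
    """Render binary data as a C/C++ const unsigned char array (sketch)."""
    lines = [
        f"// Model size: {len(data)} bytes",
        f"const unsigned int {array_name}_len = {len(data)};",
        f"const unsigned char {array_name}[] = {{",
    ]
    # Emit 12 bytes per line as 0xNN literals
    for i in range(0, len(data), 12):
        chunk = ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        lines.append(f"  {chunk},")
    lines.append("};")
    return "\n".join(lines)

# Example with a dummy payload (a real call would read the .tflite file)
print(bytes_to_c_array(b"\x1c\x00\x00\x00TFL3", "my_model_data"))
```

The emitted string can be pasted into a microcontroller project and referenced as `my_model_data` / `my_model_data_len`.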
```python
from services.tflite_converter import convert_tfjs_to_tflite, tflite_to_c_array

# Convert TensorFlow.js model to TFLite
tflite_path = convert_tfjs_to_tflite(
    model_path='path/to/tfjs/model',
    output_path='output/model.tflite',
    quantize=True
)

# Generate C/C++ byte array
cpp_code = tflite_to_c_array(
    tflite_path=tflite_path,
    array_name='my_model_data',
    include_metadata=True
)
print(cpp_code)
```

Run tests using pytest:

```bash
pytest tests/
```

Run with coverage:

```bash
pytest --cov=services tests/
```

The conversion service will be exposed via a REST API endpoint (to be implemented in task 2.1):
- POST /api/convert-to-tflite
- Accepts TensorFlow.js model data
- Returns formatted C/C++ code or error response
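Since the endpoint is not yet implemented, the following is only a sketch of what the request/response contract could look like, written as a framework-free WSGI handler. The payload field `model` and the response fields `code`/`error` are assumptions, not the final API:

```python
import json

def app(environ, start_response):
    """Sketch of the planned POST /api/convert-to-tflite endpoint (WSGI)."""
    if (environ.get("PATH_INFO") != "/api/convert-to-tflite"
            or environ.get("REQUEST_METHOD") != "POST"):
        start_response("404 Not Found", [("Content-Type", "application/json")])
        return [json.dumps({"error": "not found"}).encode()]

    try:
        size = int(environ.get("CONTENT_LENGTH") or 0)
        payload = json.loads(environ["wsgi.input"].read(size))
        # The real implementation would call convert_tfjs_to_tflite() and
        # tflite_to_c_array() here; this sketch echoes a placeholder instead.
        body = {"code": f"// C array for model '{payload['model']}'"}
        status = "200 OK"
    except (KeyError, ValueError):
        body = {"error": "invalid model data"}
        status = "400 Bad Request"

    start_response(status, [("Content-Type", "application/json")])
    return [json.dumps(body).encode()]
```

Invalid or missing model data maps to a `400` with an error body, matching the "code or error response" contract above.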
- The conversion process requires temporary disk space for intermediate files
- Large models (>10MB) may take longer to convert
- Quantization is enabled by default to reduce model size for microcontrollers
- The service automatically cleans up temporary files after conversion
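The cleanup behavior described above can be sketched with `tempfile.TemporaryDirectory`, which removes the intermediate files even if conversion fails partway. This is a sketch of the pattern, not the service's actual code:

```python
import os
import tempfile

def convert_with_cleanup(run_converter) -> bytes:
    """Run a conversion step inside a self-cleaning scratch directory."""
    with tempfile.TemporaryDirectory(prefix="tflite_") as workdir:
        # Intermediate files (e.g. the SavedModel) live under workdir
        out_path = os.path.join(workdir, "model.tflite")
        run_converter(workdir, out_path)
        with open(out_path, "rb") as f:
            return f.read()
    # workdir and everything in it are deleted on exit, even on exceptions

# Usage with a stand-in converter that just writes dummy bytes
data = convert_with_cleanup(
    lambda wd, out: open(out, "wb").write(b"\x00\x01"))
```

Reading the `.tflite` bytes into memory before the `with` block exits is what lets the scratch directory be deleted safely.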