Support configuring TTL for inferences #5533

@shuyangli

Description

Currently there is no TTL for inferences, so stored inference data grows without bound. We should let users configure a TTL for inferences and delete inferences older than that TTL. We also need to decide whether inferences with feedback attached should be retained past the TTL.
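One possible shape for this (a sketch, not the team's design): TensorZero stores observability data in ClickHouse, which supports table-level TTL with a conditional `DELETE` clause, so the user-facing setting could map onto a `MODIFY TTL` statement. The table and column names below are illustrative, not the actual schema. Note that a ClickHouse TTL expression cannot reference another table, so retaining inferences that have feedback attached would require either a denormalized flag column maintained on feedback writes (as sketched here) or a periodic delete job instead of a pure TTL.

```sql
-- Hypothetical schema: table and column names are illustrative only.
ALTER TABLE ChatInference
    MODIFY TTL timestamp + INTERVAL 90 DAY
    -- `has_feedback` would be a flag set when feedback is recorded,
    -- since a TTL expression cannot join against a feedback table.
    DELETE WHERE has_feedback = 0;
```

The 90-day interval would come from a user-configurable setting (e.g. a gateway config key), with retention-of-feedback behavior as a separate toggle.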

Labels: needs-triage (this issue still needs to be triaged by the TensorZero team)
